en.planet.wikimedia

September 02, 2014

Jeroen De Dauw

Wikibase DataModel 1.0

I’m happy to announce the 1.0 release of Wikibase DataModel. Wikibase DataModel is the canonical PHP implementation of the Data Model at the heart of the Wikibase software.

This is a big release which has been some time in the making, even though many additions have been split off and included in previous releases. The highlights are as follows:

Removal of the (de)serialization code

The entities and value objects in Wikibase DataModel used to have toArray and newFromArray methods. This caused several problems, such as having a pile of static code, depending on configuration (which was done via global state) and adding support for an arbitrary array format to the responsibilities of the objects. This has been fully removed, and (de)serialization can now be done via the dedicated serialization components (Public format, internal format) which were released some time back.

In earlier versions of Wikibase DataModel, the Item and Property classes internally stored the array representation rather than fields for the value objects that items and properties contain. While this was not visible via the getters and setters, which dealt with those value objects, it was exposed in the constructor. As of DataModel 1.0, the constructors take the value objects rather than the array representation.
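To make the shift concrete, here is a minimal, hypothetical sketch of the same design change. It is written in Python purely as an analogue, so none of these names are the actual Wikibase DataModel (PHP) classes: the point is that the entity constructor receives typed value objects instead of an opaque array representation, and serialization lives in a separate component.

```python
from dataclasses import dataclass, field

# Illustrative analogue only: these are NOT the real Wikibase DataModel classes.
@dataclass
class ItemId:
    serialization: str  # e.g. "Q42"

@dataclass
class Fingerprint:
    labels: dict = field(default_factory=dict)        # language code -> label
    descriptions: dict = field(default_factory=dict)  # language code -> description

@dataclass
class Item:
    id: ItemId
    fingerprint: Fingerprint

# Old style (removed): the constructor / newFromArray took an array representation,
# e.g. Item.newFromArray({"entity": "Q42", "label": {"en": "Douglas Adams"}}).
# New style: construct from value objects; (de)serialization is a separate concern.
item = Item(ItemId("Q42"), Fingerprint(labels={"en": "Douglas Adams"}))

def serialize_item(item: Item) -> dict:
    """Stand-in for a dedicated serializer component, kept outside the entity."""
    return {
        "id": item.id.serialization,
        "labels": item.fingerprint.labels,
        "descriptions": item.fingerprint.descriptions,
    }
```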

Deprecation of Entity

Type hinting against the Entity class has been deprecated. This was announced on the Wikidata tech list some time back. While the class is still there, most methods it defines have been deprecated, and some have been removed. A new EntityDocument interface has been introduced in version 0.8.2, which can be used instead. As part of the cleanup here, Item has been made to only accept statements, rather than all claims, as it wrongly did before.

 

Many more changes and additions were made. You can view the full list of changes affecting users of the component in the release notes.

by Jeroen at September 02, 2014 04:33 PM

Wikimedia UK

Your pictures on one of the busiest websites in the world?

The image is the logo of Wiki Loves Monuments and features a white jigsaw puzzle piece on a red background

This post was written by Michael Maggs, a volunteer organiser of Wiki Loves Monuments (and Wikimedia UK Chair)

September is your chance to take part in the annual photography competition to improve Wikipedia. The encyclopaedia is visited by 500 million people every month, and is seeking help from RPS members to improve its photos.

Wiki Loves Monuments is aimed at improving coverage of the UK’s listed buildings and ancient monuments, and starts on Monday 1st September. The contest is supported by the Royal Photographic Society, English Heritage, and Wikimedia UK (the UK charity that supports Wikipedia and its sister projects).

We’ve got lots of pictures of Tower Bridge and Stonehenge, but there’s so much more of the country’s heritage to celebrate. There are tens of thousands of eligible sites, so check out the UK competition website and see what’s nearby. As well as prizes for the best image, we have a special prize this year for the best image of a listed building on one of the ‘At Risk’ registers.

It doesn’t matter when your photos are taken so long as they are uploaded during September 2014. If you took some stunning pictures back in April, or five years ago, you can still upload them.

In line with the charitable and educational aims of the contest, you’ll need to agree to release your entries under a free licence allowing them to be freely used by anyone for any purpose, including Wikipedia. You retain copyright, and can require anyone using your images to attribute them to you as photographer.

Help us show off your images of your local history!

The competition is open until Tuesday 30 September. You can see full details of how to enter here.

by Stevie Benton at September 02, 2014 01:53 PM

September 01, 2014

Wikimedia DC

Congress edits Wikipedia: Our perspective as Wikipedians in the nation’s capital

A screenshot of the CongressEdits Twitter feed from September 1, 2014.

By Peter Meyer and James Hare

This past July, programmer Ed Summers created CongressEdits, a Twitter feed that posts an update every time an edit to Wikipedia is made anonymously from an IP address belonging to the United States Congress. Wikipedians who edit through a registered account have their edits attributed to their username, while those who edit without being logged in have their edits attributed to their IP address. The range of IP addresses used by Congressional offices is public knowledge, and the Twitter bot reports only those where the person posting wasn’t logged in. In fact, Wikipedia administrators have been watching out for Congressional edits for years.
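For readers curious how such a bot works mechanically, here is a minimal sketch (not Ed Summers' actual code) of the same idea: poll the English Wikipedia recent-changes API for anonymous edits and flag those coming from a watched IP range. The range below is a documentation placeholder, not a real Congressional allocation.

```python
import ipaddress
import requests

# Placeholder range for illustration; the real bot uses the published
# Congressional IP allocations instead.
WATCHED_RANGES = [ipaddress.ip_network("192.0.2.0/24")]

API = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "list": "recentchanges",
    "rcshow": "anon",          # anonymous (logged-out) edits only
    "rcprop": "user|title|timestamp|comment",
    "rclimit": 50,
    "format": "json",
}

changes = requests.get(API, params=params).json()["query"]["recentchanges"]
for change in changes:
    # For anonymous edits, the "user" field is the editor's IP address.
    ip = ipaddress.ip_address(change["user"])
    if any(ip in net for net in WATCHED_RANGES):
        print(f'{change["timestamp"]}: {change["title"]} edited from {ip}')
```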

CongressEdits provided a new level of visibility to these edits. The Twitter account has around 30,000 followers as of writing; by comparison, the English-language Wikipedia has 1,400 administrators. The visibility and resulting press coverage generated a lot of interest in Wikipedia on the Hill—particularly since some of the edits are disruptive (and sometimes downright hateful). That said, they are mostly the kind of juvenile or disruptive edits that Wikipedia deals with every minute of every day without incident, notable only because of where the edits are coming from. Over the years Wikipedia has developed sophisticated technologies, including filters that prevent certain edits from even happening, that ensure that most trivial vandalism gets swiftly undone.

Most press coverage of CongressEdits has focused on acts of vandalism, and one would think we would want to chase Congressional staff away. In fact, Wikimedia DC welcomes edits by Congressional staff and the staffs of federal government agencies. Government staff are experts in areas of public interest, including very new hot topics. They play a promising role in our mission to make a better online reference work, with notable, neutrally phrased, verifiable content. We can overlook minor indiscretions and work with Capitol Hill and all federal employees to forge a path forward.

Recently we partnered with the Cato Institute for a panel on editing Wikipedia on Capitol Hill. You can read about it in U.S. News & World Report. Cato and Wikimedia DC both agree that Congress does have a part to play in Wikipedia—not political advocacy, but transparently improving the quality of information about legislation and other Congressional activity. This includes not just direct edits to articles, but making data about government more open and machine-readable for reuse in highly visible third-party platforms like Wikipedia. There is a great potential for Wikipedia as a platform to increase awareness of Congress’ activities, a potential we should not overlook.

Best practices for federal employees

Wikimedia DC is interested in developing best practices for employees at all levels of government. The National Archives and Records Administration (NARA) has been working with the Wikipedia community since 2011, pioneering government engagement with Wikipedia and showcasing the potential to serve the public.

If you or your agency are interested in participating as a Wikipedia editor, we recommend these basic best practices:

  • Register individual accounts. Registering an account helps you develop goodwill with the Wikipedia community. Fellow editors get the sense that they are working with another person, not a shadowy figure hiding behind an IP address. However, Wikipedia’s policies do not permit the registration of group or company accounts; each account must be used by one person only.
  • Acknowledge your potential conflicts of interest. The community of volunteers that maintains Wikipedia cares very strongly about potential conflict of interest. To this end, avoid editing articles on your boss or your employer. Additionally, being transparent about your affiliation can help build trust. NARA has a standard format for conflict-of-interest disclaimers, a format which can be freely copied and re-used by others in the federal government.
  • Look into other agencies’ best practices. Some agencies have published best practices on Wikipedia participation, including NARA, the Department of Health and Human Services, and the National Institutes of Health. These are best practices you may wish to incorporate, should you have the opportunity to develop best practices for your own agency. We also recommend reading Why CongressEdits Matters for Your Agency on DigitalGov.

Peter Meyer is the Treasurer of Wikimedia DC and the Chair of Wikimedia DC’s Public Policy Committee. James Hare is the President of Wikimedia DC.

by James at September 01, 2014 09:06 PM

Wikimedia UK

Welcoming our Programme Intern

Photo shows Roberta Wedge in the Wikimedia UK office

Roberta Wedge, Wikimedia UK Programme Intern

This section was written by Daria Cybulska, Programme Manager

One of Wikimedia UK’s key aims as a charity is to teach under-represented groups how to edit Wikipedia (women make up about 10% of editors), and to develop under-represented content (e.g. Women in Science). Wikimedia UK has been running ‘Women in Science’ editathons for the last two years – one of the first was the much acclaimed Royal Society event held to celebrate Ada Lovelace Day in 2012, as part of the wider Ada Lovelace Day celebrations.

In 2013 our editathons expanded and received extremely positive responses from attendees and beyond. They were organised with strong support from the Medical Research Council, which enabled us to deliver events in partnership with other organisations who hosted them and invited people from their networks to attend. Since then we have been contacted by various organisations interested in collaborating with us further.

Thanks to the popularity of these activities we decided to dedicate more capacity to organising these diversity events (logistics can take a lot of time and effort!), and perhaps even to growing the group of people who are interested and keen to be involved in this programme.

This leads me to welcoming Roberta Wedge, our Programme Intern, who is joining us for four months to focus particularly on Ada Lovelace Day 2014, but also to support our gender gap activities in general. (To learn more about the role visit this page.)

This section was written by Roberta Wedge, Programme Intern

Wikipedia is a miracle of human ingenuity and vision and hard work. It can transform lives, and perhaps even save them, as with the recent Ebola initiative. It is also fraught with human difficulties and limitations. One result of that – and one of the worst or most worrying aspects of Wikipedia, from my perspective – is that the vast majority of editors are male, with all the ramifications that that brings. If women’s voices are not heard, and women’s stories are not told, the world as a whole is the poorer. The same goes for every under-represented group.

One of the best and most heartening aspects of Wikimedia UK (and, from what I know of them, other chapters and the Foundation too) is the acknowledgement that this gender gap is a problem, and the commitment to changing the situation. There’s a relevant parallel here. Educators and employers in STEM fields (science, technology, engineering, mathematics) know that they have to work intelligently to build the pipeline (encourage girls in) and stop the leaks (keep women in the workforce). Just as women are rare in STEM, women are less likely to join Wikimedia projects and more likely to leave.

I’ll be working on this with Wikimedia UK until the end of the year. One of the main things I want to do is organise editathons, and possibly other events, to engage more women to edit, and to encourage everyone to edit related subjects. The biographies of women in science are an obvious starting point. I expect I’ll be approaching GLAMs, universities, and learned societies, both existing and new partners, as potential hosts.

Once Ada Lovelace Day is over, there’s Women’s History Month on the horizon. Aside from organising events, and finding ways to persuade those of you reading this to set up your own events, I want to collect ideas that might help structural change. One example: a volunteer (whom I won’t name without his permission) mentioned in passing that for each biography of a man that he creates, he makes a point of creating at least one about a woman. It’s a simple step, but it makes a difference.

If you have any ideas, please get in touch.

by Stevie Benton at September 01, 2014 05:06 PM

Joseph Reagle

Wikipedia's citation mess and how to cope

Wikipedia citations and bibliographies are a confusing mess. This isn't just the case for newbies, but also for experienced academics. In "Wikipedia's Citation Mess and How to Cope" I explain some sources of confusion and recommend a better (but uncommon) approach to using citations at Wikipedia.

by Joseph Reagle at September 01, 2014 04:00 AM

Tech News

Tech News issue #36, 2014 (September 1, 2014)


September 01, 2014 12:00 AM

August 31, 2014

Wiki Loves Monuments

Wiki Loves Monuments 2014 Has Started

Wiki Loves Monuments 2014 Has Started – Good Luck to all participants and all new participating countries.

Wiki Loves Monuments is an annual event which has taken place across the globe every September for the past five years.

The competition is designed to bring together people who value their local historic environment with amateur and professional photographers alike to capture images of the world’s historic monuments.

These photos are then shared with the world under free licences via Wikimedia Commons, a free media repository which amongst other things provides most of the images for Wikipedia.

by Deror Lin at August 31, 2014 08:02 PM

Gerard Meijssen

#Wikidata - my #workflow enriching Wikidata using tools

As I have other commitments, I do not have the same amount of time to do what I used to do. The workflow I use is now quite stable and dependable so I am happy to publish it. It is fairly easy and obvious. You can do this too.

Objectives are important; mine are:
  • Make Wikidata more informative by adding relevant statements
  • Provide the basis for further usage of the data
My workflow is based on the people who died in 2014. This is reported in categories. ToolScript informs me about all items that do not have a date of death. Every line represents an item; typically they are human, but there are also horses and other critters included. I click the Reasonator icon, and the links to articles provide me with the first lines of each article. Typically the dates of birth and death are included. I copy this text when it is not English and use Google Translate. From the translated text I copy the date of birth and date of death. I click on the Q-number in Reasonator and add these dates in Wikidata.

The ToolScript can easily be pointed to 2013 or any other year. Obviously you can make your own script to do whatever you need.
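As a rough illustration of the kind of check ToolScript performs, the sketch below asks the Wikidata API whether given items already have a date of death (property P570). It is a minimal stand-in, not the actual tool, and the item IDs are arbitrary examples.

```python
import requests

API = "https://www.wikidata.org/w/api.php"

def items_missing_date_of_death(qids):
    """Return the item IDs from `qids` that have no P570 (date of death) claim."""
    params = {
        "action": "wbgetentities",
        "ids": "|".join(qids),
        "props": "claims",
        "format": "json",
    }
    entities = requests.get(API, params=params).json()["entities"]
    return [qid for qid, entity in entities.items()
            if "P570" not in entity.get("claims", {})]

# Example item IDs only (Douglas Adams, Charles Darwin).
print(items_missing_date_of_death(["Q42", "Q1035"]))
```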

Once somebody is registered as dead, I look at the article for interesting categories. They can be anything from "Alma mater university X" to "player of Whatever FC". Most interesting are the implied facts NOT yet recorded for the dearly departed. Any category may contain hundreds of other items for which that fact is not yet known. The first thing to do is to document said category, which can be on any wiki. Documenting is done by including a statement with "is a list of" "human" and a qualifier like "alma mater" "University X". Reasonator will show at most the first 500 entries of the resulting query.

When many entries are still missing, Autolist2 is the tool to use. From the Reasonator page of the category, copy the name of the category and the P and Q values to the appropriate spots. Do not forget to make sure that the right wiki has been selected (en in the example). Consider the depth; depth 0 is safest. Make sure that the WDQ mode is set to "AND" and press "Run". This will generate the list that is selected for processing. Check the list and copy the P and Q values to the control box. Click "Process commands" when you feel comfortable with the results. Once the process starts, you will find the changes in the Reasonator page for the item you add statements for; in the example of the illustration it is the New Zealand Order of Merit.

Note that most entries are often in the "local language", as in this example for people who work(ed) at the University of Innsbruck.

With a workflow like this you are more effective. The work is documented and slowly but surely Wikidata becomes truly informative.
Thanks,
     GerardM

by Gerard Meijssen (noreply@blogger.com) at August 31, 2014 07:55 AM

August 29, 2014

Wikimedia Foundation

Evaluation Portal on Meta: A Redesigned Space for Learning


Just over one year ago, the Wikimedia Foundation started talking about evaluating programs like Wiki Loves Monuments and the Wikipedia Education Program. The goal was to grow support for program leaders to evaluate activities and outcomes that would lead to learning and improving the effectiveness of their programs.

As we have engaged in this work, the collection of evaluation resources has grown significantly. In order to better support program leaders and the broader community in learning about evaluation, we had to reimagine our pages on meta. We are happy to introduce you to the newly redesigned evaluation portal!

Screenshots of the redesigned Evaluation portal: the top page, the Plan section, Evaluation at Wikimedia, upcoming events, and the contact page.

Improved organization

The new portal has four main sections with evaluation resources: Study, Plan, Measure and Share. Two other sections, Connect and News & Events, are spaces for networking within the evaluation community through talk pages, online and face-to-face events. We’d like to take a moment to explain these sections and how they may be useful for anyone who wants to evaluate their programs.

Study. Program evaluation is an academic field, with its own language and theory that can be studied. The Study section has resources to guide new evaluators with the vocabulary, theory and strategies related to evaluation in the Wikimedia movement.

The Glossary is one of the most valuable pages; it defines some of the key terms used in conversations about program evaluation. Explanations for phrases like program leader or program implementation are found here. With evaluation, it can often help to read what others have done: you can go through examples of how evaluation fits within the movement in Evaluation in the Wikimedia Movement: Case studies. Step-by-step guides called Learning modules walk through resources and tools for evaluating a program. Some of the topics include writing surveys and using Wikimetrics.

Plan. Evaluating a program means planning in advance. This section of the portal is designed to cover the important steps of planning an evaluation: identifying goals, choosing targets for those goals, and deciding which metrics to use for measuring those targets.

Choosing Goals and Measures provides guidance for setting outcome targets. Once you identify your goal (or goals), you might review Program Resources as a most basic guide of best practices and associated program goals and metrics. If your program is slightly different, or if you are creating a new program, the Logic Model is a great process to map your program’s or project’s vision. Explore Learning Patterns related to implementation to learn how to collect usernames, how to ask about gender, or how to advertise a program.

Measure. In order to evaluate a program you must know what you will measure and how you will measure progress toward your goals. The Measure section can help: it provides strategies for collecting and keeping track of data.

Tracking and monitoring can capture data for telling the story of a program: how the program is working and where improvements might be needed. The Reporting and Tracking Toolkit offers guidance and templates for tracking a program, from the inputs, like hours or money, to the outputs, like the number of participants, and the outcomes, like the number of editors retained. Wikimetrics is a useful tool for easily measuring user contributions on Wikimedia projects. Meanwhile, surveys can measure participants’ attributes (e.g. gender, hometown), attitudes (e.g. motivation to edit), or behaviors (e.g. how many times they edit per week). The Survey Question Bank is a repository for questions searchable by program goal or survey goal, and Qualtrics, an online survey platform, is a tool program leaders may access for large-scale online surveys.
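Wikimetrics has its own web interface, but as a minimal stand-in for this kind of measurement, the sketch below counts how many edits a participant made on the English Wikipedia during a program period using the public MediaWiki API. The username and dates are placeholders.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def count_edits(username, start, end):
    """Count a user's edits between two ISO timestamps via the usercontribs API."""
    params = {
        "action": "query",
        "list": "usercontribs",
        "ucuser": username,
        "ucstart": end,    # results run newest-first, so "start" here is the later timestamp
        "ucend": start,
        "uclimit": "max",
        "format": "json",
    }
    total, cont = 0, {}
    while True:
        data = requests.get(API, params={**params, **cont}).json()
        total += len(data["query"]["usercontribs"])
        if "continue" not in data:
            return total
        cont = data["continue"]  # follow API continuation for long histories

# Hypothetical participant and program period.
print(count_edits("ExampleUser", "2014-08-01T00:00:00Z", "2014-09-01T00:00:00Z"))
```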

Share. A key aspect of learning and evaluation is sharing what you know. This section is the portal space where the entire community can share results of activities and evaluations related to Wikimedia programs.

Writing and sharing reports can be very helpful for learning from one another about evaluation strategies. Evaluation Reports (beta) is an initial collection of program impact reports that provides many details on the process and ways to analyze data. Program leaders can also read or post Case Studies to show the work they have done. In addition to sharing reports, it is great to share tips or solutions to problems you have found along the way. Creating or endorsing Learning Patterns are great ways to reflect and share with your peers.

Better spaces for communication

Connect is a space for the evaluation community to talk about evaluation, metrics, programs and to meet one another.

If you are involved in planning, implementing, or evaluating Wikimedia projects and programs, add your photo to the Community section and share which programs you have been involved in. If you want to ask a question about evaluation, this is the place to post it on-wiki.

News and Events is for the Learning and Evaluation team to post upcoming events we are hosting, or hear about from community members, related to Wikimedia learning and evaluation.

We frequently host Virtual Meet-ups and training events to build our shared knowledge around programs, measurement and evaluation. Follow this page to keep up with upcoming events and learning opportunities!

Visit the Portal @ meta:Grants:Evaluation

While the sections and resources in the portal will continue to develop, we hope that the new organization will help all of us better navigate the useful content held there. Please visit the portal and let us know how it can help you! Also feel free to send us any feedback about the site’s organization or content.

As always, email eval@wikimedia.org if you have any questions!

by carlosmonterrey at August 29, 2014 10:19 PM

Wikimedia UK

Castles in the digital age

Clem Rutter’s photo of Rochester Castle (worth clicking to view larger)

When you spend time on one of the busiest websites in the world it’s amazing what patterns emerge.

A few weeks ago I was leafing through a borrowed copy of The Historian. It had been passed on to me because there was a piece about castles. As I leafed through its immaculately presented pages I was stopped by an eerily familiar photo. There was Rochester Castle on a beautiful sunny day, a sky blue backdrop, and the medieval cathedral peeking out behind.

That stopping power was important. For me at least, a good photograph makes me want to learn more, especially on Wikipedia where a plethora of links can drag you into a maze full of interesting twists and turns.

I knew where that snapshot came from. It was unmistakably the main photo on the Wikipedia article about the castle. I was also lucky enough to have met the man responsible for it. The photographer is Clem Rutter, who has more than a decade’s experience of writing for Wikipedia and is apparently a decent photographer to boot.

It was an exciting moment of recognition, mixed with a bit of pride that The Historian was happy to use the picture. I decided to send Clem the magazine so he could see how good it looked in print, where it illustrated a piece by a professor of history. But this blog isn’t about the magazine. I want to say thank you, Clem, for taking that photo.

I hope you admire the picture as much as I do.

Have you been inspired to emulate Clem? Wiki Loves Monuments 2014 starts on 1 September, but you can take pictures in advance so go out and get snapping!

by Richard Nevell at August 29, 2014 07:27 PM

Priyanka Nag

Maker Party Bhubaneshwar

Last weekend I had a blast in Bhubaneshwar. Over two days, I was there at two different colleges for two Maker parties.

On Saturday (23rd August 2014), we were at the Center of IT & Management Education (CIME), where we were asked to address a crowd of 100 participants whom we were supposed to teach webmaking. Trust me, very rarely do we get such a crowd at events where we get the opportunity to be less of a teacher and more of a learner. We taught them webmaking, true, but in return we learnt a lot from them.

Maker Party at Center of IT & Management Education (CIME)

On Sunday, things were even more fabulous at the Institute of Technical Education & Research (ITER) of Siksha 'O' Anusandhan University, where we were welcomed by around 400 participants, all filled with energy, enthusiasm and the willingness to learn.

Maker Party at Institute of Technical Education & Research(ITER)

Our agenda for both days was simple: to have loads and loads of fun! We kept the tracks interactive and very open ended. On both days, we covered the following topics:
  • Introduction to Mozilla
  • Mozilla Products and projects
  • Ways of contributing to Mozilla
  • Intro to Webmaker tools
  • Hands-on session on Thimble, Popcorn and X-ray goggles and Appmaker
On both days, we concluded our sessions by giving away some small tokens of appreciation, like T-shirts, badges, stickers etc., to the people who had been extra awesome among the group. We concluded the awesomeness of the two days by cutting a very delicious cake and fighting over it till its last piece.
 
Cake.....
Bidding goodbye after two days was tough, but after witnessing the enthusiasm of everyone we met during these two events, I am very sure we are going to return to Bhubaneshwar soon for even more awesomeness.
 
A few people who are to be thanked for making these events successful and very memorable are:
  1. Sayak Sarkar, the co-organizer of these events.
  2. Sumantro, Umesh and Sukanta, for travelling all the way from Kolkata and helping us out with the sessions.
  3. Rish and Prasanna, for organizing these events.
  4. Most importantly, the entire team of volunteers from both colleges, without whom we wouldn't have been able to even move a desk.
P.S. – Not to forget, we did manage to grab the media's attention as well. The event was covered by a local newspaper.
The article in the newspaper next morning

by priyanka nag (noreply@blogger.com) at August 29, 2014 04:09 PM

Gerard Meijssen

#Wikidata - Adolf Butenandt, Nobel laureate, professor and student

For many professors, Wikidata knows which university employs or has employed them. Data about this has been added a category at a time. Often this has been repeated for categories about the same university from different Wikipedias.

At the same time, information has been added about the universities where people studied. However, there is an increasing number of professors for whom it is not known where they studied.

Professor Butenandt is a case in point; he studied at the University of Marburg and the University of Göttingen. This is known on one Wikipedia and not on others. Given that categories are linked as well, it is fairly easy to spot such missed opportunities.


Thanks to this query by Magnus, we know of about 23,351 professors without an alma mater. For Mr Butenandt the information has been or will be added and, obviously, there is much more work left to do.
Thanks,
     GerardM


by Gerard Meijssen (noreply@blogger.com) at August 29, 2014 06:28 AM

August 28, 2014

Wikimedia Foundation

Venerable cultural institution partners with Wikimedia Serbia

Matica srpska building in Novi Sad

The Matica Srpska (MS) and Wikimedia Serbia (WMRS) are joining forces for an exciting new endeavor to digitize all of the contents of at least two Serbian dictionaries over the next year, including the Serbian ornithological dictionary and the dictionary of the dialects of Vojvodina. What is even more exciting for the free culture movement is this collaboration with Serbia’s oldest cultural and scientific institution, and how it came to be.

Founded in 1826, the Matica – which has become a Slavic symbol for an institution that promotes knowledge – was the nexus point for fostering the Serbian national identity and enlightenment during the days of the Ottoman and later Habsburg rule. Today, it still serves as an important center of Serbian culture, housing departments for Natural Sciences, for Performance Arts and Music, Lexicography and more. Additionally, the Matica Srpska acts as an art gallery for eighteenth and nineteenth century paintings, a library containing over 3.5 million books and a publishing house for ten periodicals and, of course, an array of Serbian dictionaries and encyclopedias.

Milos Rancic, the first president of Wikimedia Serbia, believes that this is a historic feat for Serbian culture and Wikimedia.

Logo of Wikimedia Serbia

“The significance of this cooperation for Wikimedia is that we are at the beginning of a close relationship with a national, cultural institution, whose foci include dictionaries and encyclopedias. They share our goals and want to cooperate with us.”

But how was Milos able to lay the groundwork for a potentially ground-breaking agreement between WMRS and MS? The answer: micro-grants.

Back in June, WMRS received an interesting proposal for its micro-grants program. The project was about creating a photograph gallery of a single person over time. The project was later deemed unsuitable for the grant; but Milos, still intrigued by the concept of the project, decided to fund it personally.

By chance, this amateur photographer just so happened to be a top Serbian lector, an editor of the Orthography of the Serbian language and a lexicographer at the Matica Srpska. The two men proceeded to talk on a number of topics, including photography, the financial state of the MS and its desire to have more initiatives.

“I had bold ideas, of course, but I was quite skeptical about the possibility of cooperation between WMRS and Matica Srpska,” Milos admitted.

Image of Milos (left) taken at the Third regional conference of Wikimedia Serbia in Belgrade

“However, he convinced me that the president of MS is likely willing to cooperate and that we should talk about that.”

A meeting was scheduled, and a few weeks later a delegation comprising Mile Kis (Executive Director of WMRS), Ivana Madzarevic (WMRS program manager) and Milos entered into initial talks with the Matica Srpska.

The meeting lasted two hours. Then, both parties dispersed.

Weeks went by without confirmation from MS.

It was not until July 16 that word arrived. “We got a formal letter from MS, which summarized our meeting and emphasized their commitment to accessibility of knowledge to as many people as possible.”

Milos notes that small, deliberate steps are necessary in order to achieve lasting results. “This is just the beginning, of course. We share important traits with institutions like MS. It’s about long term goals. We want to start cooperation and develop it. They want to share their content on the Internet. With our (technological, licensing, etc.) help, they will become an institution which shares its content by default, no matter whether we are involved or not.”

Over the course of the next couple of years, Milos hopes to begin discussing uploading the main Serbian dictionaries too.

Milos says that one cannot overestimate the efficacy of having a grants program, no matter the size. “When you go outside and tell people that you are willing to support their projects, it can lead to some interesting outcomes. It is important to understand the possibilities that could be opened up, and to seize them.”

Michael Guss, Communications volunteer at the Wikimedia Foundation

by carlosmonterrey at August 28, 2014 11:08 PM

Priyanka Nag

Maker Party gets grander in Pune this time

While going through my Twitter timeline this evening, I noticed Michelle Thorne's tweet stating that India leads with the most Maker Party action this season.
Well, who doubts that! In India, we have Maker Parties being organized almost every second day. My Facebook wall and Twitter timeline are overloaded with posts, photos and updates from all the Maker Parties happening around me.

Maker Party, Pune


Well, if you are still not aware of this one, we are having the granddaddy of these Maker Parties in Pune on the 6th of September 2014. The Executive Director of the Mozilla Foundation, Mark Surman, is going to be personally present at this event. Just like all Maker Parties, this event is an attempt to map and empower a community of educators and creative people who share a passion to innovate, evolve and change the learning landscape.

A few quick updates about this event:
  •  Event date - 6th and 7th September
  •  Event venue - SICSR, Model Colony, Pune
  • Rough agenda for the event is going to be:
    • 6th September 2014 (Day 1) 
      • 10am - 11am : Mozilla introduction
      • 11am - 12 : About Hive initiative
      •  12 - 1pm: Rohit Lalwani - Entrepreneurship talk
      •  1-2pm : Lunch break
      •  2pm - 3pm: Webmaker begins with Appmaker
      •  3pm - 4pm: Webmaker continues with Thimble
      •  4pm - 4.45pm: Webmaker continues with Popcorn
      •  4.45pm - 5.30pm : Webmaker continues with x-ray goggles
      • 5.30pm - 6pm: Prize distribution (for the best makes of the day, etc.). Science fair also ends
      • 6pm - 7pm : Birds of a feather
      • 7pm : Dinner (venue - TBD)
Science fair will be from 12 noon to 6pm.
  
    •  7th September 2014 (Day 2) 
      • 1st Half: Community Meetup and Discussions on the future roadmap for Hive India,
        Long term partnership prospect meeting with partners.
      •  2nd Half: Community training sessions on Hive and Train the trainer events.
 
For this event, we are having a variety of different training sessions, workshops and science displays – ranging from 3D printing to woodwork, origami to quadcopter flying, and even film making.

If you have still not registered for this event, here's your chance:


You can register for the event using this Google form: https://docs.google.com/forms/d/12lD2Rloz7QlhpNPrnZkfoEP0m0UjYZDJtyaRul5YahM/viewform

by priyanka nag (noreply@blogger.com) at August 28, 2014 06:39 PM

Andy Mabbett (User:Pigsonthewing)

Whatever happened to Henry Wheeler of Bath: tailor, naval signalman, and Desert Island Discs castaway?

Lately, I’ve been writing lots of Wikipedia biographies of people who have been “castaways” on the BBC Radio programme, Desert Island Discs.

A desert island. Probably not in the North Sea
Photo by Ronald Saunders, on Flickr, CC-BY

Of all the varied people — priests, writers, musicians and others — that I’ve written about, one above all has intrigued me, because I’ve found out less about him than about any other, even though I have a full transcript of the programme.

That person is Henry Wheeler.

As I wrote on Wikipedia:

Henry Wheeler (born 1924 or 1925) was a naval signalman during World War II. The eldest in a family of six, he was from Vernham Grove in Bath, England, where his civilian role was as a tailor’s assistant.

He joined the Royal Navy in 1943, undertook his naval training at HMS Impregnable, went to France on the day after D-Day, and was later stationed in Rotterdam. While in Rotterdam, he had a romantic relationship with a Dutch woman, named Dine.

Shortly after the war’s end, he appeared as a “castaway” on the BBC Radio programme Desert Island Discs, on 24 November 1945, at the age of 20. He was chosen to appear as he was serving on an unspecified “small island off the European coast” — the nearest thing available to a real castaway.

And that is pretty much all anybody seems to know about him. Was he real, or a propaganda fiction, or perhaps using a pseudonym? Did he return home to Bath to resume tailoring? Or did he return to marry Dine, in Rotterdam? Does anyone at Vernham Grove remember his family? Are his descendants still alive in Bath, Rotterdam or elsewhere? Indeed, is he?

by Andy Mabbett at August 28, 2014 03:02 PM

August 27, 2014

Wikimedia Foundation

Reimagining Mentorship with the Wikipedia Cooperative

I JethroBT, Project Manager of the Co-op, at Wikimania 2014.

An editor’s initial experience when contributing to Wikipedia can be daunting: there is a ton to read and it’s easy to make mistakes right off the bat and feel pushed away when edits are reverted. My name is Jethro and a small team of editors and I are addressing these issues by building a mentorship space called the Wikipedia Cooperative, or simply the Co-op. In the Co-op, learners (i.e. editors seeking mentorship) will have the chance to describe how they want to contribute to Wikipedia and subsequently be matched with mentors who can teach them editing skills tailored to their goals.

We are working under an Individual Engagement Grant and hope to complete a pilot and analysis of our mentorship space by early next year. If successful, we hope to fully open the space and provide tools to allow similar projects to be built in other Wikipedia projects. We recently passed the second month of our grant and I wanted to share our progress with you thus far.


We recently brought Dustin York onto our team as our graphic designer. York’s background designing the WMF’s Travel and Participation Support grantmaking pages, along with other experience such as work with UNICEF, will be invaluable to us. He has begun exchanging ideas in hopes that the design work will be in full swing by September. We intend to make the space friendly and inviting for both learners and mentors alike and are confident that we can create a promising look and feel.

Product/Interaction Designer Dustin York’s illustration work for the WMF’s Travel & Participation Support grants pages on Meta.

In program development, we’ve organized an editing curriculum that we hope to make available to learners as part of the mentorship. We’ve categorized these skills into three different levels of difficulty as well as by skill type (see example). We’ve also finalized a conceptual design for how learners will be matched with mentors.

Example skills planned to be made available at the Co-op.

In our research, we’ve finished designing interview protocols and questions for editors who have participated in help spaces on Wikipedia, such as the Teahouse and The Wikipedia Adventure. We have started reaching out to such editors for interviews – their feedback will help guide our upcoming design decisions.

We have narrowed down key questions we want answered which we will use to help us understand the impact of our project:

  • How well does the Co-op work?
  • What predicts how well the Co-op works for particular learners?
  • What features work best in various existing programs?
  • Why do learners seek out and continue mentorship?

We also completed background research in addition to a preliminary mentor survey to assess how and why editors participate in mentoring. We have published our key findings on our hub on the English Wikipedia.

Lastly, our team was well-represented at Wikimania 2014 in London. We met often, sought out prospective programming candidates and connected with a number of editors and Foundation staff to discuss feedback and ideas for our project.

We plan to begin our pilot in early December and are seeking out editors who are interested in mentoring a small number of learners during this pilot period. If you are interested, please let us know on our project talk page or contact me directly. We believe that mentorship is a positive and personalized way to promote good editing habits for editors in addition to engaging productively with the editing community. It is our hope that our efforts, along with those of the mentors, will create a more approachable atmosphere for users who want to contribute to Wikipedia.

This article was co-authored by Soni and IJethroBT

by carlosmonterrey at August 27, 2014 09:41 PM


Wikimedia UK

Does Wikimania save lives?

This post was written by Fabian Tompsett, Wikimedian and co-ordinator of the Wikimania support team, and originally published here.

Yes it was quite a surprise to find myself with other Wikimedians back in September 2008

I am just coming to the end of a four-month stint working for Wikimedia UK, helping to deliver Wikimania 2014 at London’s Barbican Centre. It was all quite exciting and, as The Signpost put it, was “not too bad, actually”. In the whirl of events, with dozens of hackers bringing hacking home to Hackney, hunched over their laptops, while other devotees were busy tweeting, it became all too easy to miss some key aspects of the event, and so to fail to recognise that Wikimania contributed to saving lives.

Wikipedia is not just a website; it is also a somewhat heterogeneous international community which thrives on face-to-face encounters in meatspace. For me, my involvement gained an extra dimension when I started attending the regular London Meetups six years ago. It was meeting other human beings, rather than tapping away while staring at a computer screen, which made it interesting.

So, this August the London Meetup page modestly subsumed Wikimania within its calendar of monthly events, an expansion to a three-day event with between 2,000 and 4,000 attendees (so much for “British understatement“). But in essence it is the face-to-face interactions outside the formal sessions which make Wikimania such a powerful event. I don’t want to be dismissive about the formal sessions and all the hard work which went into them; it is just that I want to focus on the other aspects and use this to show why I believe Wikimania saves lives.

2014 West Africa Ebola virus outbreak situation map

A couple of weeks after Wikimania a discussion opened up on the Wikimedia Ghana list which spoke of an initiative by Carl Fredrik Sjöland of Wikipedia:WikiProject Medicine, which has teamed up with Translators Without Borders to set up a Translation taskforce. As they explained a couple of years ago, “We believe that all people deserve high quality healthcare content in their own language.” Faced with the current Ebola outbreak in West Africa, the focus of these activities has shifted to finding people to translate information about Ebola into the relevant indigenous languages. There is something similar happening through the Humanitarian OpenStreetMap Team, who have also been very active developing mapping resources for the medics on the ground.

I had hoped to make it to the OpenStreetMap 10th Birthday Party (the London celebrations were held nearby, to coincide with Wikimania) but I got caught up in other things and only arrived after most of the people had left. But that was precisely what Wikimania was like: you find out more and more about it in the aftermath.

Graph indicating the comparative amount of Wikipedia content available for readers in different circumstances. Nearly all indigenous languages in Africa are comparable to Gujarati.

Another aspect I found out about afterwards was Denny’s comments on A new metric for Wikimedia, where he discusses the availability of Wikipedia in different languages. Considering the recent Ebola outbreak above, this is not just a “nice idea”, but something which requires support now. Often it is not so much a matter of getting hold of finances, but of finding a way in which people with the relevant language skills can be linked up and given the resources to make things happen.

An important aspect of this is that the speakers of these languages are not just passive recipients of knowledge generated in the geographical north. They can also contribute their own knowledge. This also touches on the notion of cognitive justice as developed by Shiv Visvanathan in The search for cognitive justice:

Cognitive justice is not a lazy kind of insistence that every knowledge survives as is, where is. It is an idea which is actually more playful in the sense the Dutch historian Johann Huizinga suggested when he said play transcends the opposition of the serious and the non-serious. Play seeks encounters, the possibilities of dialogue, of thought experiments, a conversation of cosmologies and epistemologies. A historical model that comes to mind is the dialogue of medical systems, where doctors once swapped not just their theologies but their cures. As A. L. Basham put it, the dialogue of medicines, each based on a different cosmology, was never communal or fundamentalist. It recognized incommensurability but allowed for translation.

This is a viewpoint which has been taken up in what is called Open ICT for Development, where “openness” is understood to include the participation of communities in the governance of their own lives.

So what I found out in the aftermath of Wikimania is the question: does Wikimania save lives? Can it help people get together and come up with practical methods by which people get in touch and existing initiatives are taken to a higher level? Will it have an effect in this example and save lives? In this sense Wikimania is not over. Its legacy depends on what action people take in its aftermath.

So I am writing this blog because I want you to see if there is something you can do to help either the Humanitarian OpenStreetMap Team or the Translation taskforce find more support for their projects in fighting Ebola.

by Richard Nevell at August 27, 2014 11:48 AM

Jeroen De Dauw

SoCraTes 2014


Last week I attended SoCraTes 2014, the 4th International Software Craftsmanship and Testing Conference in Germany.

Since this was the first time I went there, I did not really know what to expect, and was slightly apprehensive. Turns out there was no need for that, the conference was the most fun and interesting I’ve been to recently, and definitely the most motivating as well.

What made it so great? Probably the nicest aspect of the conference were the people attending. Basically everyone there was passionate about what they were doing, interested in learning more, open minded, and respectful of others. This, combined with the schedule being composed purely of sessions people proposed at the start of each day, made the whole atmosphere very different from that of your typical commercial conference. Apart from attending sessions on various topics, I also did some pair programming and played my first set of beach volleyball games at a software conference.

I’m definitely going back next year!

by Jeroen at August 27, 2014 09:42 AM

Gerard Meijssen

#Wikipedia - Professor Hermann Buhl "Leichtathlet"

Mr Buhl died in Tirol while wandering through the Alps. He used to be an athlete of repute and became a professor at the Julius-Maximilians-Universität.

It is obvious that Mr Buhl was a professor because of his presence in a category. It is not obvious in the same way where he studied and what he taught. The text expects a lot of knowledge about the DDR to make such things obvious.

Every Wikipedia has its notability criteria and the German Wikipedia is no different. Mr Buhl is certainly notable as an athlete, but his career did not end there. He is probably notable as well for the "latter" part of his career. Some would argue that he started to contribute in a meaningful way when he taught in university.
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at August 27, 2014 09:28 AM

Tony Thomas

Using puppet realm switch to select between beta/prod ( Wikimedia clusters )

Since the BounceHandler extension is currently installed only in the beta cluster (the official testing servers of Wikimedia: deployment.wikimedia.beta.wmflabs.org), writing a custom router in the Exim configs of operations/puppet (the configuration repo managed by Puppet) to collect all the bounce emails and HTTP POST them to the extension API seemed risky. This was […]

by Tony Thomas at August 27, 2014 07:12 AM

Exim: Creating and using Macros

The topic looks easy, but implementing macros was a great learning experience, as I found out. Macros help you reuse a lot of code and make the Exim configuration look tidy. In the earlier post, I scribbled how to define an Exim regex to capture all VERPed emails. Tidying this up a […]

by Tony Thomas at August 27, 2014 06:47 AM

August 26, 2014

Wikimedia Tech Blog

Content Translation: 100 published articles, and more to come!

On July 17, 2014, the Wikimedia Language Engineering team announced the deployment of the ContentTranslation extension in Wikimedia Labs. This first deployment was targeted primarily for translation from Spanish to Catalan. Since then, users have expressed generally positive feedback about the tool. Most of the initial discussion took place in the Village pump (Taverna) of the Catalan Wikipedia. Later, we had the opportunity to showcase the tool to a wider audience at Wikimania in London.

Initial response

In the first 2 weeks, 29 articles were created using the Content Translation tool and published in the Catalan Wikipedia. Article topics were diverse, ranging from places in Malta, to companies in Italy, a river, a monastery, a political manifesto, and a prisoner of war. As the Content Translation tool is also being used for testing by the developers and other volunteers, the full list of articles that make it to a Wikipedia is regularly updated. The Language Engineering team also started addressing some of the bugs that were encountered, such as issues with paragraph alignment and stability of the machine translation controller.

The number of articles published using Content Translation has now crossed over 100 and its usage has not been only limited to Catalan Wikipedia. Users have been creating articles in other languages like Gujarati and Malayalam, although machine translation has not been extended beyond Spanish−Catalan yet. All the pages that were published as articles had further edits for wikification, grammar correction, and in some cases meaningful enhancement. A deeper look at the edits revealed that the additional changes were first made by the same user who made the initial translation, and later by other editors or bots.

Wikimania in London

Amir Aharoni of the Wikimedia Language Engineering team introduces the Content Translation tool to the student delegation from Kazakhstan at Wikimania 2014, in London.


The Content Translation tool was showcased widely at Wikimania 2014, the annual conference of the Wikimedia communities. In the main conference, Santhosh Thottingal and Amir Aharoni presented on machine-aided translation delivery through Content Translation. During the pre-conference hackathon, Pau Giner conducted a testing session with student volunteers from Kazakhstan, who were enthusiastic about using the tool in their local Wiki Club. Requests for fully supporting other language pairs were brought up by many users and groups, like the Wikipedia Medical Translation project. Discussions were held with the Wikidata team to identify areas of collaboration on data reuse for consistent referencing across translated versions. These include categories, links, etc.

The Language Engineering team members worked closely with Wikimedians to better understand requirements for languages like Arabic, Persian, Portuguese, Tajik, Swedish, German and others, that can be instrumental in extending support for these languages.

Further development

The development of ContentTranslation continues. Prior to Wikimania, the Language Engineering team met to evaluate the response and effectiveness of the first release of the tool, and prepared the goals for the next release. The second release is slated for the last week of September 2014. Among the features planned are support for more languages (machine translation, dictionaries), a smarter entry point to the translation UI, and basic editor formatting. It is expected that translation support from Catalan to Spanish will be activated by the end of August 2014. Read the detailed release plan and goals to know more.

Over the next couple of months, the Language Engineering team intends to work closely with our communities to better understand how the Content Translation tool has helped editors so far and how it can serve the global community better with the translation aids and resources currently integrated with the tool. We welcome feedback at the project talk page. Get in touch with the Language Engineering team for more information and feedback.

Amir Aharoni and Runa Bhattacharjee, Language Engineering, Wikimedia Foundation

by Guillaume Paumier at August 26, 2014 12:34 PM

Wikimedia engineering report, July 2014

Major news in July include:

Note: We’re also providing a shorter and translatable version of this report.

Engineering metrics in July:

  • 164 unique committers contributed patchsets of code to MediaWiki.
  • The total number of unresolved commits went from around 1575 to about 1642.
  • About 31 shell requests were processed.

Personnel

Work with us

Are you looking to work for Wikimedia? We have a lot of hiring coming up, and we really love talking to active community members about these roles.

Announcements

  • Arthur Richards is now Team Practices Manager (announcement).
  • Kristen Lans joined the Team Practices Group as Scrum Master (announcement).
  • Joel Sahleen joined the Language Engineering team as Software Engineer (announcement).

Technical Operations

Dallas data center

Throughout July, the cabling work for all racked servers and other equipment was nearly completed. We’re still awaiting the installation of the first connectivity to the rest of our US network in early August before we can begin installing servers and services.

San Francisco data center

Due to a necessary upgrade to the power and cooling infrastructure in our San Francisco data center (which we call ulsfo), our racks were migrated to a new floor within the same building on July 9. The move was completed very smoothly without user impact, and the site was brought back online serving all user traffic again in less than 24 hours.

PFS enabled

Through the help of volunteer work and research, our staff enabled Perfect Forward Secrecy on our SSL infrastructure, significantly increasing the security of encrypted user traffic.

Labs metrics in July:

  • Number of projects: 173
  • Number of instances: 464
  • Amount of RAM in use (in MBs): 1,933,824
  • Amount of allocated storage (in GBs): 20,925
  • Number of virtual CPUs in use: 949
  • Number of users: 3,500

Wikimedia Labs

  • We’ve made several minor updates to Wikitech: we added OAuth support, fixed a few user interface issues, and purged the obsolete ‘local-*’ terminology for service groups.
  • OPW intern Dinu Sandaru has set up forms for structured project documentation. This will help match new volunteers with existing projects, and will make communication with project administrators more straightforward.
  • Sean Pringle is in the process of updating the Tool Labs replica databases to MariaDB version 10.0. This may reduce replag, and should improve performance and reliability.
  • We’re setting up new storage hardware for the project dumps. This will resolve our ongoing problems with full drives and out-of-date dumps.

Features Engineering

Editor retention: Editing tools

VisualEditor

In July, the team working on VisualEditor converged the design for mobile and desktop, made it possible to see and edit HTML comments, improved access to re-using citations, and fixed over 120 bugs and tickets.

The new design, with controls focussed at the top of each window in consistent positions, was made possible due to the significant progress made in cross-platform support in the UI library, which now provides responsively-sized windows that can work on desktop, tablet and phone with the same code. HTML comments are occasionally used on a few articles to alert editors to contentious or problematic issues without disrupting articles as they are read, so making them prominently visible avoids editors accidentally stepping over expected limits. Re-using citations is now provided with its simple dialog available in the toolbar so that it is easier for users to find.

Other improvements include an array of performance fixes aimed especially at helping mobile users, fixes for a number of minor cases where VisualEditor would corrupt the page, better monitoring to catch corruptions if they occur, and better support for right-to-left languages, with icons displayed in the correct orientation based on context.

The mobile version of VisualEditor, currently available for beta testers, moved towards stable release, fixing a number of bugs and editing issues and improving loading performance. Our work to support languages made some significant gains, nearing the completion of a major task to support IME users, and the work to support Internet Explorer uncovered some more issues as well as fixes. The deployed version of the code was updated five times in the regular release cycle (1.24-wmf12, 1.24-wmf13, 1.24-wmf14, 1.24-wmf15 and 1.24-wmf16).

In wider news, the team expanded its scope to cover all MediaWiki editing tools, becoming the new Editing Team (covered below).

Editing

In July, the newly re-named and re-scoped Editing Team was formed from the VisualEditor Team. We are responsible for extending and improving the editing tools used at Wikimedia – primarily VisualEditor and maintenance for WikiEditor. We exist to support new and existing editors alike; our current work is mostly on desktop, and we are working with Mobile to take responsibility for all editing across desktop, tablet and phone platforms, spanning approximately 50 different areas of MediaWiki and extensions related to editing. We will continue to report progress on VisualEditor separately.

The biggest Editing change this month was in the Cite extension (for footnotes) – this now automatically shows a references list at the end of the page if you forget to put in a <references /> tag, instead of displaying an ugly error message. The Math extension (for formulæ) was improved with more rigorous error handling and LaTeX formula checking, as part of the long-term volunteer-led work to introduce MathML-based display and editing. The TemplateData GUI editor was deployed to a further six wikis – the English, French, Italian, Russian, Finnish and Dutch Wikipedias.

A lot of work was done on libraries and infrastructure for the Editing Team and others. The OOjs UI library was extensively modified to bring in a new window management system for comprehensive combined desktop, tablet and phone support, as well as other updates to improve Internet Explorer compatibility and accessibility of controls. In the next few months the team will continue working on OOUI to support other teams’ needs and implement a consistent look-and-feel in collaboration with the Design team. The OOjs library was updated to fix a minor bug, with a new version (v1.0.11) released and pushed downstream into MediaWiki, VisualEditor and OOjs UI. The ResourceLoader framework was extended to allow skins to set the “skinStyles” property themselves, rather than rely on faux dependencies, as part of wider efforts led jointly by a volunteer and a team member to improve MediaWiki’s skin support.
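One way skins can express this (a minimal sketch assuming the $wgResourceModuleSkinStyles mechanism; the skin name, module names and paths below are made up for illustration) is to map module names to skin-specific stylesheets in the skin's setup code:

<?php
// Illustrative only: a skin ("exampleskin") attaching its own styles to modules
// defined elsewhere, instead of declaring a faux dependency on those modules.
$wgResourceModuleSkinStyles['exampleskin'] = array(
	// Module name => stylesheet loaded whenever that module is used
	// while the "exampleskin" skin is active.
	'mediawiki.special.search' => 'skinStyles/mediawiki.special.search.css',
	'mediawiki.notification' => 'skinStyles/mediawiki.notification.css',
);

Because each stylesheet is tied to a module rather than loaded unconditionally, it only ships when the module itself is in use.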

Parsoid

In July, the Parsoid team continued with ongoing bug fixes and bi-weekly deployments.

With an eye towards supporting Parsoid-driven page views, the Parsoid team strategized on addressing Cite extension rendering differences that arise from site-message-based customizations, and is considering a pure CSS-based solution for the common use cases. We also finished developing the test setup for running mass visual diff tests between PHP parser rendering and Parsoid rendering. It was tested locally, and we started preparations for deploying it on our test servers. This will go live in late July or early August.

The GSoC 2014 LintTrap project continued to make good progress. We had productive conversations with Project WikiCheck about integrating LintTrap with WikiCheck in a couple of different ways. We hope to develop this further over the coming months.

Overall, this was also a month of reduced activity, with Gabriel now officially full time on the Services team and Scott focused on the PDF service deployment that went live a couple of days ago. The full team is also spending a week at an off-site meeting, working and spending time together in person prior to Wikimania in London.

Services

Services and REST API

The brand new Services group (currently Matt Walker and Gabriel Wicke) started July with two main projects:

  1. PDF render service deployment
  2. Design and prototyping work on the storage service and REST API

The PDF render service is now deployed in production, and can be selected as a render backend in Special:Book. The renderer does not work perfectly on all pages yet, but the hope is that this will soon be fixed in collaboration with the other primary author of this service, C. Scott Ananian.

Prototyping work on the storage service and REST API is progressing well. The storage service now has early support for bucket creation and multiple bucket types. We decided to configure the storage service as a backend for the REST API server. This means that all requests will be sent to the REST API, which will then route them to the appropriate storage service without network overhead. This design lets us keep the storage service buckets very general by adding entry point specific logic in front-end handlers. The interface is still well-defined in terms of HTTP requests, so it remains straightforward to run the storage service as a separate process. We refined the bucket design to allow us to add features very similar to Amazon DynamoDB in a future iteration. There is also an early design for light-weight HTTP transaction support.
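To make the routing idea concrete, here is a deliberately simplified sketch in PHP (purely illustrative; the actual service is a separate codebase, and every class, path and bucket name below is invented) of a front end that dispatches requests to generic buckets in-process:

<?php
// Simplified illustration of the pattern described above: one front end
// receives every request and routes it to a generic storage bucket without
// an extra network hop. All names and paths are invented for this example.
class MemoryBucket {
	private $data = array();
	public function get( $key ) {
		return isset( $this->data[$key] ) ? $this->data[$key] : null;
	}
	public function put( $key, $value ) {
		$this->data[$key] = $value;
	}
}

class RestFrontEnd {
	private $buckets;
	public function __construct( array $buckets ) {
		$this->buckets = $buckets;
	}
	// e.g. handle( 'GET', '/enwiki/html/Main_Page' )
	public function handle( $method, $path, $body = null ) {
		if ( !preg_match( '!^/([^/]+)/([^/]+)/(.+)$!', $path, $m ) ) {
			return array( 404, 'Bad path' );
		}
		list( , $domain, $bucketName, $key ) = $m;
		if ( !isset( $this->buckets[$bucketName] ) ) {
			return array( 404, 'No such bucket' );
		}
		$bucket = $this->buckets[$bucketName];
		if ( $method === 'PUT' ) {
			$bucket->put( "$domain/$key", $body );
			return array( 201, 'Created' );
		}
		$value = $bucket->get( "$domain/$key" );
		return $value === null ? array( 404, 'Not found' ) : array( 200, $value );
	}
}

// Entry-point specific logic (which buckets exist and what they store) lives
// in the front-end configuration; the buckets themselves stay generic.
$api = new RestFrontEnd( array( 'html' => new MemoryBucket() ) );
$api->handle( 'PUT', '/enwiki/html/Main_Page', '<p>Hello</p>' );
list( $status, $html ) = $api->handle( 'GET', '/enwiki/html/Main_Page' );

Keeping the interface HTTP-shaped means the same buckets could later be served by a separate storage process without changing callers.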

Matt Walker is sadly leaving the Foundation by the end of this month to follow his passion of building flying cars. This means that we currently have three positions open in the service group, which we hope to start filling soon.

Core Features

Flow

In July, the Flow team built the ability for users to subscribe to individual Flow discussions, instead of following an entire page of conversations. Subscribing to an individual thread is automatic for users who create or reply to the thread, and users can choose to subscribe (or unsubscribe) by clicking a star icon in the conversation’s header box. Users who are subscribed to a thread receive notifications about any replies or activity in that thread. To support the new subscription/notification system, the team created a new namespace, Topic, which is the new “permalink” URL for discussion threads; when a user clicks on a notification, the target link will be the Topic page, with the new messages highlighted with a color. The team is currently building a new read/unread state for Flow notifications, to help users keep track of the active discussion topics that they’re subscribed to.

Growth

Growth

In July, the Growth team completed its second round of A/B testing of signup invitations for anonymous editors on English Wikipedia, including data analysis. The team also built the first API and interface prototypes for task recommendations. This new system, first aimed at brand new editors, makes suggestions based on a user’s previous edits.

Mobile

Wikimedia Apps

Following the successful launch on Android, the Mobile Apps team released the new native Wikipedia app for iOS on July 31. The app is the iOS counterpart to the Android app, with many of the same features, such as editing, saving pages for offline reading, and browsing history. The iOS app also contains an onboarding screen, shown the first time the app is launched, asking users to sign up; this feature was also launched on Android this month (see below).

On Android this month we released accessibility and styling features to production which were requested by our users, such as a night mode for reading in the dark and a font size selector. We also released an onboarding screen that asks users to sign up.

Our plan for next month is to get user feedback from Wikimania, wrap up our styling fixes, and begin work on an onboarding screen the first time that someone taps edit.

Mobile web projects

This month, the team continued to focus on wrapping up the collaboration with the Editing team to bring VisualEditor to tablet users on the mobile site. We also began working to design and prototype our first new Wikidata contribution stream, which we will build and test with users on the beta site in the coming month.

Wikipedia Zero

During the last month, the team worked on software architecture features that allow for expansion of the Wikipedia Zero footprint on partner networks and that get users to content faster, with support for lowered cache fragmentation on Varnish caches. Whereas the previous system supported a one-size-fits-all configuration for heterogeneous partner networks, inhibiting some zero-rated access, the new system supports multiple configurations for disparate IP addresses and connection profiles per operator. Additionally, lightweight script and GIF-ified Wikipedia Zero banner support has been added and is being tested; in time this should drastically reduce Varnish cache fragmentation, making pages load faster and reducing Varnish server load. A faster landing page was introduced for “zerodot” (zero.wikipedia.org, legacy text-only experience) landing pages when operators have multiple popular languages in their geography. Work on compression proxy traffic analysis for header enrichment conformance with the official Wikipedia Zero configurations was also performed after more diagnostic logging code was added to the system. Finally, watchlist thumbnails, although low bandwidth, were removed from the zerodot user experience, as was the higher-bandwidth MediaViewer feature for zerodot; mdot will have these features, though.

In side project work, the team spent time on API continuation queries, Android IP editing notices, Amazon Kindle and other non-Google Play distribution, and Google Play reviews (now that the Android launch dust has settled, mobile apps product management will be triaging the reviews). In partnerships work, the team met with Mozilla to talk about future plans for the Firefox OS HTML5 app (e.g., repurposing the existing mobile website, but without any feature reduction) and how Wikimedia search might be further integrated into Firefox OS, and also spoke with Canonical about how Wikipedia might be better integrated into the forthcoming Ubuntu Phone OS.

Routine pre- and post-launch configuration changes were made to support operator zero-rating, with routine technical assistance provided to operators and the partner management team to help add zero-rating and address anomalies. The team also continued its search for a third Partners engineering teammate.

Wikipedia Zero (partnerships)

We served an estimated 68 million free page views in July through Wikipedia Zero. We continue to bring new partners into the program, though none launched in July. Adele Vrana met with prospective partners and local Wikimedians in Brazil. We published our operating principles to increase transparency.

Language Engineering

Language tools

The CLDR extension was updated to use CLDR 25; this work was mostly done by Ryan Kaldari. The team made various internationalization fixes in core, MobileFrontend, the Wikipedia Android app, Flow, VisualEditor and other features. In the Translate extension, Niklas Laxström fixed ElasticSearchTTMServer to provide translation memory suggestions longer than one word, and improved translation memory suggestions for translation units containing variables (bug 67921).

Language Engineering Communications and Outreach

We announced the initial availability of the Content translation tool with limited feature support. We are focusing on supporting Spanish to Catalan translations for this initial release. You can read a report on the feedback received since deployment.

Content translation

An initial version was released on Beta Labs; it supports machine translation between Spanish and Catalan. The machine translation API leverages open source machine translation with Apertium. The tool supports experimental template adaptation between languages. Numerous bug fixes were made based on testing and user feedback. We worked on matching the Apertium version to the cluster, and planning for the next round of development has started.
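As a minimal sketch of what requesting a Spanish-to-Catalan translation from an Apertium-style HTTP service can look like (the endpoint URL, parameter names and response shape here are assumptions for illustration, not necessarily the exact API the tool uses):

<?php
// Hypothetical example: asking an Apertium-style web service for a
// Spanish-to-Catalan translation. Endpoint, parameters and response shape
// are assumed for illustration only.
$endpoint = 'http://apertium.example.org/translate';
$params = array(
	'langpair' => 'es|ca',
	'q'        => 'Los voluntarios escriben la enciclopedia.',
);

$response = file_get_contents( $endpoint . '?' . http_build_query( $params ) );
if ( $response !== false ) {
	$data = json_decode( $response, true );
	// Assumed response shape: { "responseData": { "translatedText": "..." } }
	echo $data['responseData']['translatedText'], "\n";
}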

Platform Engineering

MediaWiki Core

HHVM

The Beta cluster is running HHVM. The latest MediaWiki-Vagrant and Labs-vagrant use HHVM by default.

Admin tools development

Most admin tools resources are currently diverted towards SUL finalisation, which will greatly help in reducing the admin tools backlog. July saw the deployment of the global rename tool (bug 14862), and core fixes including the creation of the “viewsuppressed” userright (bug 20476).

Search

Our deployment of CirrusSearch to larger wikis as the primary search back-end turned out to be too ambitious. After encountering performance issues, we rolled back this change. We are now addressing the root of the problem by getting more servers (nearly doubling the cluster size) and putting together more optimizations to the portion of Cirrus that fell over (the working set). If everything goes as planned, the working set will be reduced by about 80%, trading some indexing performance for better search performance. These optimizations will slightly change result relevance; please let us know if you notice any issues.

Auth systems

Most work was spent on SUL finalisation tasks. PHPUnit and browser tests were added for CentralAuth, global rename was deployed, and many small fixes were made to CentralAuth to clean up user accounts in preparation for finalisation.

SUL finalisation

In July, the SUL finalisation team began work on completing the necessary feature work to support the SUL finalisation.

To help users with local-only accounts that are going to be forcibly renamed as part of the SUL finalisation, the team is working on a form that lets those users request a rename. These requests will be forwarded to the stewards to handle. The SUL team is currently in consultation with the stewards about how they would like this tool to work. When this consultation is wrapped up, the team will begin design and implementation.

To help users get globally renamed without having to request renames on potentially hundreds of wikis, the team implemented and deployed GlobalRenameUser, a tool which renames users globally. As the tool is designed to work post-finalisation, it only performs renames where the current name is global, and the requested name is totally untaken (no global account and no local accounts exist with that name).

To help users who get renamed by the finalisation and, despite our best efforts to reach out to them, did not get the chance to request a rename before the finalisation, the team is working on a feature to let users log in with their old credentials. The feature will display an interstitial when they log in, informing them that they logged in with old credentials and that they need to use new ones. We are also considering a persistent banner for those users, so that they definitely know they need to use their new credentials. An early beta version of this feature is complete, and now needs design and product refinements to be completed.

To help users who get renamed by the finalisation and, as a result, have several accounts that were previously local-only turned into separate global accounts, the team is working on a tool to merge global accounts. We chose to merge accounts as it was the easiest way to satisfy the use case without causing further local-global account clashes that would cause us to have to perform a second finalisation. The tool is in its preliminary stages.

The team also globalised some accounts that were not globalised but had no clashes. These accounts were either created in this local-only form due to bugs, or are accounts from before CentralAuth was deployed where the user never globalised. As these accounts had no clashes, there were no repercussions to globalising these accounts, so we did this immediately.

At present, no date has been chosen for the finalisation. The team plans to have the necessary engineering work done by the end of the quarter (end of September 2014), and have a date chosen by then.

Next month the team plans to continue work on these features.

Security auditing and response

MediaWiki 1.23.2 was released, fixing 3 security bugs. Security reviews were made for BounceHandler and Petition extensions, and the password API was merged.

Release Engineering

Release Engineering

This month, the Release and QA Team became the Release Engineering Team, reflecting its transition from a group made up of members of other distinct teams to a (mostly) self-contained, coherent team. This will, hopefully, allow better coordination of “Release” and “QA” work (broadly speaking).

A lot of progress was made on making Phabricator suitable as a task/bug tracking system for Wikimedia projects. You can see the work to be sorted and completed at this workboard.

The Beta Cluster now runs with HHVM, bringing us much closer to full HHVM deployment. In addition, the Language Team deployed the new Content translation system on the Beta Cluster with the help of the Release Engineering team.

The second round of public RFP for third-party MediaWiki release management was conducted and concluded.

We now no longer use the third-party Cloudbees service for any of our Jenkins jobs and run all jobs locally. This will enable us to better diagnose issues with our build process, especially as it pertains to our browser tests (which still mostly run on SauceLabs).

Quality Assurance

This month, the QA team reached two significant milestones: after porting all the remaining browser tests from the browsertests repository to the repositories of the extensions being tested in June, as well as porting a significant set of tests to MediaWiki core itself, we completely retired the Jenkins instance running on a third-party host in favor of running test builds from the Wikimedia Jenkins instance, and we deleted the /qa/browsertests code repository. These moves are the result of more than two years of work. In addition, we added more functions to the API wrapper used by browser tests, improved support for testing in Vagrant virtual machines, added new Jenkins builds for extensions, and improved the beta labs test environments by preventing database locks and stopping users from being logged out by accident.

Browser testing

The browser tests are now all integrated with builds on the Wikimedia Jenkins host. We added browser tests for MediaWiki core that will validate the correctness of a MediaWiki installation regardless of language, or of what extensions may or may not exist on the wiki, so that the tests may be packaged with the distribution of MediaWiki itself and used on arbitrary wikis. We saw a lot of browser test activity for Flow development, and we are preparing to support even more extensions and features in the very near future.

Multimedia

Multimedia

Media Viewer’s new ‘minimal design’.

In July, the multimedia team reviewed more feedback about Media Viewer, from three separate Requests for Comments on the English and German Wikipedias, as well as on Wikimedia Commons. Based on this community feedback, the team worked to make the tool more useful for readers, while addressing editor concerns. We are now considering a new ‘minimal design’, which would include: a much more visible link to the File: page; an even easier way to disable the tool; a caption or description right below the image; removing additional metadata below the image, directing users to the File: page instead.

As described in our improvements plan, these new features are being prototyped and will be carefully tested with target users in August, so we can validate their effectiveness before developing and deploying them in September. You can see some of our thinking in this presentation.

This month, we continued to work on the Structured Data project with the Wikidata team and many community members, to implement machine-readable data on Wikimedia Commons. We prepared to host a range of online and in-person discussions to plan this project with our communities, and aim to develop our first experiments in October, based on their recommendations. We also continued a major code refactoring of UploadWizard, and fixed a number of bugs in some of our other multimedia tools.

Last but not least, we prepared seven different multimedia roundtables and presentations for Wikimania 2014, which we will report on in more depth in August. For now, you can keep up with our work by joining the multimedia mailing list.

Engineering Community Team

Bug management

At the Pywikibot bug days, 189 reports received updates. On the technical side, Jan enabled CSS cache invalidation and strict transport security, Matanya updated Bugzilla's cipher_suite and cleaned up a template, and Daniel deleted an unused config file. Tyler and Andre added requested components to Bugzilla. Planning of an exposed "easy bug of the week" continued, summarized on a wiki page.

Phabricator migration

Phabricator’s “Legalpad” application (a tool to manage trusted users) was set up on a separate server. This instance provides WMF Single-User Login authentication.

Mukunda implemented restricting access to tasks in a certain project, which can be tested on fab.wmflabs.org. As a follow-up, he investigated enforcing the security policy on files and attachments as well, and replacing the IRC bots with Phabricator's chat bot. Chase worked on initial migration code to import data from Bugzilla reports into Phabricator tasks (and ran into missing API code in Phabricator), investigated configuring Exim for mail, set up a data backup system for Phabricator, and upgraded the dedicated Phabricator server to Ubuntu Trusty. Quim started documenting Phabricator.

Andre helped make decisions on defining field values and how to handle certain Bugzilla fields in the import script, and sent a summary email to wikitech-l about the Phabricator migration status.

Mentorship programs

All Google Summer of Code and FOSS Outreach Program for Women projects continued their development toward a successful end. For details, check the reports:

Technical communications

Chart showing historical Flesch reading ease data for Tech News, a measure of the newsletter’s readability. Higher scores indicate material that is easier to read. A score of 60–70 corresponds to content easily understood by 13- to 15-year-old students.

Guillaume Paumier collaborated with authors of the Education newsletter to set it up for multilingual delivery, using a script similar to the one used for Tech News. He also wrote a detailed how-to to accompany the script for people who want to send a multilingual message across wikis. In preparation for the Wikimania session about Tech News, he updated the readability and subscribers metrics. He also continued to provide ongoing communications support for the engineering staff, and to prepare and distribute Tech News every week.

Volunteer coordination and outreach

We focused on the preparation of the Wikimania Hackathon, encouraging all registered participants to propose topics and sign up for interesting sessions. We also organized a Q&A session with potential organizers of the Wikimedia Hackathon 2015. We organized two Tech Talks: “Hadoop and Beyond: An overview of Analytics infrastructure” and “HHVM in production: what that means for Wikimedia developers”. More activities hosted in July can be found at Project:Calendar/2014/07.

Architecture and Requests for comment process

Developers finished the security architecture guidelines, and discussed several requests for comment in online architecture meetings:

dev.wikimedia.org

In July, Quim Gil sorted the tasks necessary for the first hub prototype into a Phabricator board, and Sumana Harihareswara determined which three APIs she would document first.

Analytics

Wikimetrics

Wikimetrics can now generate vital sign metrics for every project daily. The Rolling Monthly Active Editor metric has been implemented; the reports are in JSON format, stored in a logical path on a file server and downloadable. The team also worked on backfilling data for the daily reports on Newly Registered and Rolling Active Editor, and made numerous optimizations to backfill the data quickly.

Data Processing

New nodes were added to the cluster this month and all machines were upgraded to run CDH5. The team decided not to preserve any data on the cluster during the upgrade and started fresh. The team hosted a Tech Talk on our Hadoop installation (see video and slides). Duplicate monitoring has also been implemented in Hadoop to monitor the incoming Varnish logs.

Editor Engagement Vital Signs

The culmination of our efforts this month can be seen in a prototype built for Wikimania. This was made possible by many back-end enhancements (optimizations) to Wikimetrics, along with research into and selection of the most suitable technologies for the dashboard stack.

EventLogging

EventLogging monitoring is now in graphite, and we can see which schemas cause spikes in traffic (example).

Research and Data

This month, we completed the documentation for the Active Editor Model, a set of metrics for observing sub-population trends and setting product team goals. We also engaged in further work on the new pageviews definition. An interim solution for Limited-duration Unique Client Identifiers (LUCIDs) was also developed and passed to the Analytics Engineering team for review.

We analyzed trends in mobile readership and contributions, with a particular focus on the tablet switchover and the release of the native Android app. We found that in the first half of 2014, mobile surpassed desktop in the rate at which new registered users become first-time editors and first-time active editors in many major projects, including the English Wikipedia. An update on mobile trends will be presented at the upcoming Monthly Metrics meeting on July 31.

Development of a standardised toolkit for geolocation, user agent parsing and accessing pageviews data was completed.

We supported the multimedia team in developing a research study to objectively measure the preferences of Wikipedia editors and readers.

We hosted the July research showcase with a presentation by Aaron Halfaker on four Python libraries for data analysis, and a guest talk by the Center for Civic Media's Nathan Matias on the use of open data to increase the diversity of collaboratively created content.

We prepared 8 presentations that we will be giving or co-presenting next week at Wikimania in London. We also organized the next WikiResearch hackathon that will be jointly hosted in London (UK) (during the pre-conference Wikimania Hackathon) and in Philadelphia (USA) on August 6-7, 2014.

We filled the fundraising research analyst position: the new member of the Research & Data team will join us in September and we’ll post an announcement on the lists shortly before his start date.

Lastly, we gave presentations on current research at the Wikimedia Foundation at the Institute for Scientific Interchange (Turin) and at the DesignDensity lab (Milan).

Kiwix

Screenshot of the first Project Gutenberg ZIM file

The Kiwix project is funded and executed by Wikimedia CH.

We have pre-release binaries of the next 0.9 (final) release. Except for OS X, everything seems to work fine so far. Support for the Raspberry Pi was finally merged into the kiwix-plug master branch; this offers new perspectives because the price to create a Kiwix-Plug has dropped to around USD 100. We also started an engineering collaboration with ebook reader manufacturer Bookeen (in the scope of the Malebooks project) to be able to offer an offline version of Wikipedia on e-ink devices.
We participated in the Google Serve Day at Google Zurich. The goal was to meet Google engineers for one day and have them work on open source projects. The result was a dozen fixed bugs and implemented features, mostly on Kiwix for Android, but also in Kiwix for desktop and MediaWiki.
Four developers had a one-week hackathon in Lyon, France to develop an offline version of the Gutenberg library. We’re currently polishing the code and plan a release soon; our partners and sponsors plan the first deployments in Africa in Autumn.
Last but not least, a proof-of-concept of a Kiwix iOS app was made, so we might release a first app before the end of the year.

Wikidata

The Wikidata project is funded and executed by Wikimedia Deutschland.

The biggest improvement around Wikidata in July is the release of the entity suggester. It makes it a lot easier to see what kind of information is missing on an item. Helen and Anjali, Wikidata's Outreach Program for Women interns, continued improving user documentation and outreach around Wikidata, and worked on a new design for the main page. Guided Tours were published, helping newcomers find their way around the site. The developers further worked on supporting badges (like “featured article”), redirects between items, and the monolingual text datatype (to be able to express things like the motto of a country), as well as on the first implementation steps for the new user interface design. Additionally, the first JSON dumps were published.

Future

The engineering management team continues to update the Deployments page weekly, providing up-to-date information on the upcoming deployments to Wikimedia sites, as well as the annual goals, listing ongoing and future Wikimedia engineering efforts.

This article was written collaboratively by Wikimedia engineers and managers. See revision history and associated status pages. A wiki version is also available.

by Guillaume Paumier at August 26, 2014 10:09 AM

August 25, 2014

Harry Burt

Wikimania review

Wikimania 2014 was held earlier this month at the Barbican Centre in London. This particular article of mine was originally published in the Signpost, where it received about 1,500 page views.

Prologue: hackathon

The pre-Wikimania Hackathon proved popular, with developers flooding the fourth floor for its introductory session.

As has become traditional, Wikimania proper was preceded by a two-and-a-half day hackathon, with entry at slight additional cost. While there had been concerns from hackathon organisers about what percentage of those registered would actually attend, it was clear from the word “go” that it would be alright on the night: the introductory session on Wednesday morning was packed, and numbers remained high throughout Thursday and into Friday. For attendees it was an opportunity to get in some ‘hacking’—any coding of an interesting nature, including work on tools, gadgets, MediaWiki and its extensions—meet other developers, and enjoy the comfortable (if slightly unusual) surroundings of the Barbican’s tropical conservatory and garden room. On a warm summer’s day, it felt like a greenhouse—not least because, in a very real sense, it was.

Nevertheless, the social atmosphere was Wikimania at its best: light, enthusiastic and welcoming to those more unfamiliar with the movement and its goals, here including an impressive assortment of journalists. Staff proved approachable, mixing freely with volunteers—indeed, the sessions served as a reminder that Wikimedians are peculiarly lucky in that regard. Such positivity even crept into sessions as potentially fraught as that led by the Foundation’s Fabrice Florin, a presentation and chat about the development direction of the controversial Media Viewer extension. Although there were minor quibbles, like the sprawling Barbican making it difficult to move from registration (floor: -1) to venue (floor: 4), or the deployment of sandwiches at lunch (“originally supposed to be lasagne”, Ed noted critically) and nothing at dinner, it was an uncomplicated unconference executed well. Even the WiFi held up, as it did throughout the conference—more or less.

Opening session and keynotes

Conference Organiser Ed Saperia opened Wikimania proper with a brief discussion of its main themes and their inspirations.

The opening session of Wikimania, held alongside a welcome drinks reception on the Thursday evening, could roughly be divided into two halves. The first consisted of four speakers (Ed Saperia, Wikimedia UK Chief Executive Jon Davies, Jimmy Wales and Lila Tretikov) enlisted to give short welcome speeches. Apart from an off-the-cuff remark from Wales that he wished the press would talk “less about the monkey” and more about the substantive issues raised in his pre-Wikimania press conference, the burden of getting the packed auditorium to tear themselves away from their phones/tablets/buzzword bingo cards fell to Salil Shetty, Secretary General of Amnesty International and sole keynote speaker of the Thursday evening session. Though many of Shetty’s remarks fell on sympathetic ears, it was his allusions to certain problems of scaling—the forced creation of staff headquarters in developing nations; the difficulties of running a global institution alongside local chapters—which stood out and it was a shame that Shetty did not share more of his considerable experience during the keynote itself.

Salil Shetty provided the opening keynote of Wikimania 2014, discussing the development and growth of Amnesty International, which he heads.

Shetty was arguably the most prominent of the non-Wikimedia names on the list of featured speakers—surprising, perhaps, for a conference that had won the bidding process promising speakers including Clay Shirky, Cory Doctorow, Lawrence Lessig and even Stephen Fry (see related Signpost coverage). Nevertheless, the speakers eventually organised proved sufficient to regularly fill and continuously entertain the cavernous Barbican Hall. The final lineup thus included Danny O’Brien (along with Wales, one of the two survivors of the original London bid), Jack Andraka, and, able to draw on the UK’s well developed civil society infrastructure, representatives of the thinktank Demos, Code Club and Young Rewired State among others: an admirable and effective lineup, if not quite the “VIP speakers (academics, politicians, media, entertainment)” originally described by Jimmy Wales in July 2012. In a Wikimania first, all of the featured speakers’ presentations were reliably streamed live and recordings rapidly made available online, a real boon considering Wikimedia’s global appeal and the months-long delays from previous Wikimanias.

Other tracks

Wikimania 2014′s eight tracks offered access to speakers on a wide variety of subjects—here, author and associate professor of journalism Andrew Lih discusses the difficulties of getting more video onto Wikimedia wikis.

In total, Wikimania 2014 claimed some 200 sessions over 8 simultaneous tracks, replete with the inevitable scheduling and organisational headaches. The organisers will be pleased with the variety they achieved: notable themes including open access, open data, technology, GLAM and diversity were all well-represented, while smaller topics (the legal aspects of Wikipedia, for example) seemed neatly stitched into accessible 90 minute blocks. The Barbican’s cavernous layout and the comfort of its designed-for-purpose auditoria thus conspired to make these blocks, rather than individual sessions, the primary unit of time management—to the benefit of some of the more niche interest talks on the programme. Each talk seemed ably staffed by the conference’s apparently vast team of volunteers, both technically and in terms of sticking to their timetables. The blocks were then in turn punctuated by coffee breaks, lunch, and on some days (but confusingly not all) dinner. Although hackathon attendees quickly got used to the “packed lunch” format, it was the dinners that particularly stood out, including bitesize burgers, skewers and sea-bream tacos (to name a few), served in reasonable quantity but alas with the purity of queuing to which many native Britons (the author included) are accustomed.

Aided by the high overall attendance (an estimated 2,000, making London the largest Wikimania to date), all the sessions seemed to receive good levels of participation; there were not enough chairs, for example, to accommodate everyone attending an event on copyright, not usually a floor filler. Saperia added that hundreds of those tickets had been sold in the final days before the start of Wikimania proper—a reminder that it was not just hardcore Wikimedians in attendance. For those unable to attend a talk that they would have liked to—and with eight tracks, that included many attendees—slides and numerous recordings are now available. The quality of the talks varied, but around a high mean; early evidence suggests numerous standout sessions (the author would recommend Brandon Harris’ unique performance style, though his two talks were of very different kinds). Unsurprisingly, many attendees also turned to Twitter to add their comments to those of a hyperactive Wikimania social media team, with an estimated 21,000 tweets using the #wikimania or #wikimania2014 hashtags over the course of the three-day conference.

Closing speeches

The Wikimania 2014 group photograph, taken immediately before the closing speeches

After a brief video in support of the students of Sinenjongo High School in their WMF-supported campaign to get Wikipedia Zero more widely adopted in the Global South, Jimmy Wales once more took to the stage to give his “state of the wiki” remarks. Most pertinent of these was his comment that too often what is intended as a minimum bar serves to define the normal, and thus to hive off as supererogatory many of the virtues for which Wikimedia ought to strive: not just mere civility, Wales suggested, but “kindness, generosity, forgiveness, compassion”, a “morally ambitious” programme he said, but an achievable one. He also noted YouGov research indicating that the British public trusts Wikipedia more than both the tabloid and quality press.

Wales’ annual Wikimedian of the Year award went this year to Ihor Kostenko, a prominent Ukrainian Wikipedian and journalist tragically killed in the civil unrest that engulfed the capital Kiev earlier this year (see Signpost special report: “Diary of a protester—Wikimedian perishes in Ukrainian unrest”). It was a poignant and appropriate choice, although in a hat-tip to potential future controversy over the awarding of the honour, Wales promised to ensure a more “democratic” process was in place ahead of Wikimania 2015. After presenting some of the hosting chapter (Wikimedia UK)’s annual awards on their behalf, attention turned more fully to next year’s event, with a brief introductory video shown by the Mexico City team. Of its slogans, “our venue: Vasconcelos library” and “gay friendly” received the most enthusiastic support among the thousand-strong audience.

The Wikimania closing party contained its fair share of free drinks, loud music—and decidedly questionable dancing.

The speeches (including brief remarks by WMF Chair Jan-Bart de Vreede) were followed by the Wikimania closing party, an event backed by reasonable but not excessive amounts of free alcohol, and a selection of musical accompaniments in a variety of styles. Indeed, such entertainment was provided on each evening of the conference, interspersed with comedy performances on a technology theme. The latter especially was a brave choice, and the organisers will be forgiven if the jokes fell a little flat, or the dancefloor was a little empty. Patrons were also able to take advantage of the hackathon rooms—left open well into the night—or escape outside where attractive fountains punctuated the cold brutalist structure of the Barbican estate. The more adventurous tried the City of London’s wallet-busting public houses, if only for novelty value.

Epilogue: looking back

For some, the impact of Wikimania will be direct: a bustling community village featured an array of chapters eager to sign up new members, as well as a variety of non-WMF projects looking for exposure. For most, however, the effect is more subtle, subsisting in a set of renewed relationships, vague recollections and hearsay. It is difficult to see how Wikimania 2014 could have failed to impress the casual onlooker, with its sheer scale an obvious statement of intent. Of course, such a statement must also be paid for, and the debate over the financing of Wikimania, which necessarily took a backseat role for the duration of the conference, may yet cloud what should be enjoyable memories of an enjoyable Wikimania.

The same is true of the announcement, on the final day of the conference, that the WMF would be using technical measures to override local administrators on the German Wikipedia: as one European chapter member remarked, “at least it will give us something to talk about [at the closing party]”. Such worries aside, it was an impressive conference that promised the moon but had to settle for the stars.

Alternatively, in true British understatement, it was “not too bad, actually”.

by Harry at August 25, 2014 07:54 PM

Wikimedia UK

“The institutions that are loved survive”: Pat Hadley and the York Museums Trust

This post was written by Joe Sutherland

[Video: Pat Hadley and the York Museums Trust — https://www.youtube-nocookie.com/embed/QKQMWMywp8M]
Pat Hadley was part way through a PhD in archaeology at the University of York in the summer of last year when he decided to leave to explore new areas in which to apply his skillset. A natural scientist with a digital background, he is interested in the “ways in which the public engages with the past”. For him, Wikipedia is an ideal platform to investigate this.

Since late 2013, he has worked as Wikimedian in Residence at York Museums Trust, helping them to share their collections through Wikipedia and its sister projects. An archaeologist of ten years, and a contributor to Wikipedia since 2011, Pat had been keen to find a museum in York interested in opening up access to their content.

In September 2013, Wikimedia UK supported the York Museums Trust, and two other institutions, in their search for a Wikimedian in Residence. The YMT is a charitable body which manages three museums, a contemporary art space and public gardens in the city.

“[The YMT] is a brilliant test case for the GLAM-Wiki project, because it’s almost the most typical set of museums you could possibly imagine, all in one space,” Pat explains.

Despite his background in academia, he was surprised to land the role. “I heard that the new scheme was going along, but I had no idea it would be me,” he says. Through a series of “happy accidents” he found himself looking for a project at the same time that applications were open and ended up with the job.

In his time at the YMT, Pat has run many major projects. These have included training sessions for the institutions’ volunteers and staff, donations of content held by the museum to Wikimedia Commons, and a public editathon.

One of his first projects revolved around Tempest Anderson, a doctor, amateur photographer and volcanologist from 19th-century York, whose images have been retained on glass lantern slides. “The museum was planning to do a high-resolution digitisation of those anyway,” Pat explains, “and they’re public domain, so they were one of the key early collections for the project to target.”

As one of the first projects to take place during his tenure, he did face challenges during the work. “Unfortunately we only managed to get 56 images released by the end of the residency, but we got five of those used on the English, German and French Wikipedias. So we’re already beginning to make ripples across Wikipedia. Hopefully in the next few months the museum will be releasing the rest of the images.”

The work on Anderson was built upon in March 2014, in an event focused on the luminaries of historical York. “It was a nightmare to think of a theme that could bring all the collections together,” Pat says.

Pat Hadley at the Yorkshire Museum
Photo: User:Rock drum, CC-BY-SA 4.0

As such, the day allowed the improvement of a wide variety of topics on Wikipedia, ranging from natural history to fine art to archaeology. Several YMT curators presented their areas of expertise to a determined collection of sixteen participants, most of whom had never edited before.

The topics covered in the editathon included York-based artists such as Mary Ellen Best. “She was a Victorian artist who painted domestic interiors mostly in watercolour,” Pat says. “She wasn’t painting the kinds of things that were popular among Victorian artists.

“She wasn’t getting much recognition at the time, but there were a significant number of her paintings in the collections here. As a result we were able to release some of those and have some volunteers and experienced Wikipedians work together to get her a very reasonable biography, and even got a ‘Did You Know’ on the front page of Wikipedia. That was fantastic.”

Andrew Woods, curator of numismatics at YMT, had an active role during the day. He focused on the Middleham Hoard, a collection of Civil War-era coinage that was discovered in the eponymous market town in North Yorkshire. “Since we acquired it, it had lain dormant. Despite the fact it is this astonishing, very important hoard, we hadn’t done anything with it,” Andrew explains.

“It doesn’t really fit with our gallery spaces,” he adds, “so what we were really keen to do is to put it on display digitally. We had the coins imaged by a volunteer and we put those images onto Wikimedia. From there they’ve really taken off–a whole page has been written about the hoard, and they’ve been used in a number of different ways thereafter. So it’s taken a hoard that nobody really knew anything about and made it visible to so many more people.”

Overall, the partnership has led to “YMT becoming more open”, says Pat, and he argues that Wikimedia should be a key part of the missions of GLAM institutions moving forward. “They need to be connected. Somebody once said it is the institutions that are loved by everyone that survive.”

“If there are central funding cuts,” he adds, “the museums that share their collections and generate love by giving their knowledge and gardening it out… they are the ones that are going to survive through crises. They’re going to get more people supporting them in all sorts of ways.”

by Stevie Benton at August 25, 2014 03:04 PM

Gerard Meijssen

#Wikimedia - "Share in the sum of all available knowledge"

If we are to focus on the available knowledge we have to share, statistics are key. They cut the crap and focus on numbers. Given that information can be made out of data, knowing how much additional information is available that is easily understood by people who can read English is relevant. Two reports are relevant: one shows the number of links in English, and the other shows the number of labels in English. [1]

At this time there are 757,967 items with an English label and without an article. This is 4.7% of the total number of items Wikidata holds. At the same time, 58% of the items do not have a label in English.

Not having a label does not mean that we cannot provide meaningful information. The name of a Dutch or Spanish person is for instance perfectly understood; it is typically written exactly the same in English. Reasonator understands this and always presents a label anyway.

It is fairly easy to start sharing this "missing" information. It is already done in many Wikipedias. The suggestion to share more information has been put to all Wikipedias, and several "communities" do not think it is a good idea. In effect they prefer an inferior product providing a subset of the information that should be available to all our readers.
Thanks,
      GerardM

[1] It shows the numbers for other languages as well, and the statistics are near real time. It takes a minute for them to be presented to you.

by Gerard Meijssen (noreply@blogger.com) at August 25, 2014 09:54 AM

Tech News

Tech News issue #35, 2014 (August 25, 2014)


August 25, 2014 12:00 AM

August 24, 2014

Jamie Thingelstad

MediaWiki LocalSettings for Farmers

I’ve been running a MediaWiki farm at thingelstad.com for a couple of years now, hosting about a dozen wikis ranging from small to very large. Running a MediaWiki farm is a bit complicated, and you can approach it in a number of different ways. I recently pushed the settings that I use to run my farm to GitHub so others can see how I do it. The next step will be to also move up the scripts that I use, but those will be kept in another repository.

Hopefully this proves useful to others. It’s useful for me to finally have these very complicated settings (really code!) under version control.
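For readers who haven’t run a farm before, the general idea usually looks something like the following sketch (a generic example, not the actual settings from the repository above; every name and path is made up): one LocalSettings.php keys the per-wiki configuration off the requested hostname.

<?php
// Generic sketch of a wiki-farm LocalSettings.php; not the actual settings
// from the repository mentioned above. All names and paths are examples.

// Map each hostname to a short wiki identifier.
$farmWikis = array(
	'wiki.example.com'  => 'examplewiki',
	'games.example.com' => 'gameswiki',
);

$host = isset( $_SERVER['SERVER_NAME'] ) ? $_SERVER['SERVER_NAME'] : 'wiki.example.com';
if ( !isset( $farmWikis[$host] ) ) {
	die( 'Unknown wiki.' );
}
$wikiId = $farmWikis[$host];

// Shared settings for every wiki in the farm.
$wgDBserver = 'localhost';
$wgDBuser   = 'wikifarm';

// Per-wiki settings derived from the identifier.
$wgDBname   = $wikiId;
$wgSitename = ucfirst( $wikiId );
$wgServer   = 'https://' . $host;
$wgUploadDirectory = "/var/www/farm/images/$wikiId";
$wgUploadPath      = '/images/' . $wikiId;

// Optional per-wiki overrides live in their own file.
$overrides = __DIR__ . "/settings/$wikiId.php";
if ( is_readable( $overrides ) ) {
	require $overrides;
}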

by Jamie Thingelstad at August 24, 2014 01:08 PM

Gerard Meijssen

#Reasonator - A new metric for #Wikimedia

Denny wrote a really good article in the Signpost. It includes a "TL;DR" that I am happy to quote.
TL;DR: We should focus on measuring how much knowledge we allow every human to share in, instead of number of articles or active editors. A project to measure Wikimedia's success has been started. We can already start using this metric to evaluate new proposals with a common measure.
The point Denny makes is great; we aim to enable every human being to share in the sum of all knowledge, and we should measure the extent to which we are achieving this goal. When you read the article carefully, it does not say Wikipedia, it says Wikimetrics. The point Denny makes is very much that we need to focus on what it takes to bring information to people.

Presenting data that is available to us as information is what Reasonator does. It relies on what is known in Wikidata about articles that exist in any Wikipedia. To make this understood to a person, the number of available statements and the number of available labels for an item are key.

If Wikimetrics is to appreciate the potential of Wikidata and the approach Reasonator takes, it should include three pieces of information (a rough sketch of such a calculation appears below):
  • the number of statements per item
  • the number of labels per language
  • how items are covered with labels in a language
With such an approach the graph will be substantially different. No single language covers 50% of all the topics known to Wikidata, and consequently the graph will show that there is much more work for us to do. It will also indicate that the amount of information available to people who can read English is much larger, and the amount available to people who can only read Gujarati is much less.
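As a rough sketch of what such a coverage calculation could look like (purely illustrative, not Wikimetrics code; the input format and item identifiers are invented), one could count, per language, how many items carry a label:

<?php
// Illustrative sketch only: computing label coverage per language for a set
// of Wikidata-like items. Input format and names are invented for the example.
$items = array(
	'Q1' => array( 'labels' => array( 'en', 'de', 'gu' ), 'statements' => 12 ),
	'Q2' => array( 'labels' => array( 'en', 'nl' ),       'statements' => 3 ),
	'Q3' => array( 'labels' => array( 'de' ),             'statements' => 0 ),
);

$total = count( $items );
$labelCounts = array();
foreach ( $items as $item ) {
	foreach ( $item['labels'] as $lang ) {
		$labelCounts[$lang] = isset( $labelCounts[$lang] ) ? $labelCounts[$lang] + 1 : 1;
	}
}

foreach ( $labelCounts as $lang => $count ) {
	printf( "%s: %d of %d items labelled (%.1f%%)\n", $lang, $count, $total, 100 * $count / $total );
}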
Thanks,
       GerardM

by Gerard Meijssen (noreply@blogger.com) at August 24, 2014 10:02 AM

August 23, 2014

Gerard Meijssen

#Wikidata - Ameyo Adadevoh a physician from #Nigeria

When a Mr Sawyer arrived in Lagos and showed symptoms of ebola, Mrs Adadevoh took control of the situation, and thanks to her efforts ebola was largely contained. In the end it did not save her; as a physician on the front line of the fight against ebola she became a victim herself.

Mrs Adadevoh is another hero of our times. When you google about ebola and Nigeria, there are two things that are of interest; sadly there are the opinion pieces that see a conspiracy in the coming of Mr Sawyer to Nigeria, but more positive is the information about the efforts to contain ebola in Nigeria and what you can do to avoid becoming infected; personal hygiene is key.

There is a call to ensure that hospital staff are immunised. It is quite obvious that no country can really afford to lose key people like Mrs Adadevoh. It is equally obvious that all doctors and nurses who have to deal with ebola patients need to be protected. Without them containing and treating ebola is impossible.
Thanks,
    GerardM

by Gerard Meijssen (noreply@blogger.com) at August 23, 2014 11:04 PM

#Wikidata - Sheik Umar Khan a physician from Sierra Leone

Do not be mistaken. Mr Khan is a hero of our times. Mr Khan died of ebola. He was in charge of the fight to contain this awful disease.

There is a category of people who died from ebola; with currently three entries it is mercifully small. Then again, that man who died at Lagos airport is not in there… Probably more people who became notable because of ebola are missing as well.

It is important to recognise ebola for the threat it represents. One of the things you cannot do is run away from it. The only thing that is achieved is spreading the disease even further.
Thanks,
     GerardM

by Gerard Meijssen (noreply@blogger.com) at August 23, 2014 11:01 PM

Tony Thomas

Exim regex: Capture all VERPed emails with a given header pattern

Consider that your VERP generator produces a Return-Path header of the form bounces-testwiki-2a-nanrfx-Tn14EQZWaotS2XNn@mytesthost.com and you want a router to capture all bounce emails having this (or a similar address) as the To header. This router can serve multiple purposes, like feeding to a bounce processor, silently killing all bounces (not intended) or POSTing the email to an […]
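As a rough illustration of the matching involved (a pattern inferred only from the example address above, not the actual Exim router condition from the post), a regular expression along these lines would capture such VERP bounce addresses:

<?php
// Illustrative only: a regex matching VERP bounce addresses shaped like the
// example above (bounces-<wiki>-<...>-<hash>@mytesthost.com). The exact
// delimiters and fields are assumptions based on that single example.
$address = 'bounces-testwiki-2a-nanrfx-Tn14EQZWaotS2XNn@mytesthost.com';

if ( preg_match( '/^bounces-([a-z0-9]+)-[A-Za-z0-9-]+@mytesthost\.com$/i', $address, $m ) ) {
	// $m[1] holds the wiki identifier encoded into the VERP address.
	echo "Bounce for wiki: {$m[1]}\n";
}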

by Tony Thomas at August 23, 2014 07:04 AM

Gerard Meijssen

#Wikidata - the #beta label lister


At the hackathon of #Wikimania2014, work was done on a new version of the label lister. It is a gadget that allows you to edit labels and aliases in other languages. It proved to be an indispensable tool to me. Today I learned that the new label lister is now available.

The most wonderful thing is that it became much more compact: you do not need to click as much anymore, and it "just works". In the screenshot you see Mrs Bundschuh, a former member of the Landtag of Bavaria, and as you can see it is trivially easy to add a label in your language.

I hope that functionality like the label lister will make it into a core feature of Wikidata.
Thanks,
     GerardM

by Gerard Meijssen (noreply@blogger.com) at August 23, 2014 06:40 AM

August 22, 2014

Wikimedia Foundation

Grants, Programs and Learning: This year at Wikimania London

Grants, Programs & Learning booth in the Community Village.

Those present at this year’s Wikimania may have witnessed a different presence on behalf of the Grantmaking team. The department, made up of the Grants, Learning & Evaluation and Education teams, was present at the global conference, which brings together Wikimedia project programs, movement leaders and volunteers to learn from and connect with one another over the five-day event. Whether at our booth in the Community Village or in the many presentations and workshops, the conversations we shared with community members from all over the world were very enriching.

We heard from more and more people interested in gathering data and working toward understanding, at a deeper level, what works and why. In this way, we are all working together towards building sustainable growth for the movement’s projects and programs; work that not only will involve new editors, but partnerships with other institutions that can help create free knowledge.

The need for sustainable growth

Learning Day notes on Logic Model.

Before the conference, we hosted a small Learning Day for leaders in our grants program to share experiences and insights from applying evaluation to various projects – projects that might help the movement grow. Jake Orlowitz shared his game The Wikipedia Adventure, an experimental project aimed at onboarding new editors. Sandra Rientjes, the executive director of Wikimedia Nederland (WMNL), presented her chapter’s long-term approach to programs. Wikimedia UK’s Daria Cybulska shared the Wikimedians in Residence Review to show how they have used evaluation to redesign and improve an existing program. To explore diversity, Amanda Menking talked about her experience in her research project on women and Wikipedia.

These four presentations demonstrated the wide range of experiments being conducted by the grants community. Continuing to measure and discuss evaluation can help us all discover projects that have impact and to understand if and how they can be replicated in different contexts.

The day came to a close with a lively Idea Lab Mixer and Learning Day Poster Session happy hour. This was an opportunity for grantees to showcase their work and insight gained in the past year and ignite conversations around creating new ideas to make Wikimedia even more awesome.

The need to learn from each other

Jake Orlowitz during his lightning talk.

For the first time, all the representatives from our grantmaking committees got together for training and impact discussions. The pre-conference sessions also hosted a special day to welcome new FDC members and discuss Participatory Grantmaking. Guest speaker Matthew Hart shared with the group his research on how this practice takes place and what benefits it has for donors, communities and movements. What does it mean to give a Wikimedia grant and work together on a project? In light of the recent Impact Reviews developed by the Learning & Evaluation team, which focused on Annual Plan Grants and on Project and Event Grants, three main priorities were highlighted with regard to working towards the movement’s goals: expanding reach, generating more participation and improving quality.

The Wikimedia movement is known for its capacity to innovate and learn from peers. We are now at a point when we need to standardize learning processes and generate resources that guarantee this knowledge exchange. As we continue working on program resources, we will also start working more closely with grantees on their project evaluation plans, hopefully reducing the time invested in this task and increasing impact.

The need for better tools and resources!

Wikimania was a great place to share new tools, resources and strategies around shared programs. Some highlights include:

  • Category Induced: Allows you to know how many categories were created from a collection category.
  • Easy FDC report: a tool that lets you gather the number of uploaders, files uploaded and highlighted files from a specific category and make it format-ready for FDC reports.
  • Unused files: Allows users to see which files in any given category have not yet been used.
  • Wikimetrics new features: this tool now lets you know which users from your cohort are newly registered and also includes the new metric ‘Rolling active editor.’ Find out more in this presentation!
  • Quarry: Allows you to run SQL queries against Wikipedia and other databases from your browser. Stay tuned for more documentation on this tool on the Evaluation portal on Meta!

For tool-driven program leaders, the new tools directory will come in handy to find these and other resources to measure online impact!

As we continue to work on the challenges that surfaced during conversations at Wikimania, we hope to continue the dialogue online with program leaders and grantees all over the world. We are working to connect talented people and good ideas across the movement, so we call out to movement leaders: stay connected, reach out and ask!

María Cruz, Community Coordinator of Program Evaluation & Design

by carlosmonterrey at August 22, 2014 10:08 PM

Wikimedia UK

Upcoming Training for Trainers session in Edinburgh

Attendees of the February 2014 Training the Trainers event

Wikimedia UK is committed to supporting our volunteers. To encourage them to teach others how to edit Wikipedia and other Wikimedia projects, we are running a weekend training workshop. This will take place on the weekend of 1-2 November in Edinburgh, and we would particularly encourage anyone from Scotland and the north of England to attend.

The workshop will be delivered by a professional training company and aims to improve delegates’ abilities to deliver any training workshop. It’s especially relevant to anybody who already runs Wikimedia-related training, or is very interested in doing so in the near future.

The workshop is a chance to:

  • Get accredited and receive detailed feedback about your presenting and training skills
  • Get general trainer skills which you can then apply when e.g. delivering specific Wikipedia workshops
  • Share your skills with others
  • Help design a training programme that serves Wikimedia UK in the long term.

The course will run from 9:30am to 6:30pm on Saturday and from 9am to 5pm on Sunday. A light breakfast and lunch will be provided. We should also be able to cover travel and accommodation if you let us know in advance.

If you are interested in attending, please indicate your commitment by registering on this page but please note that places are limited.

If you are not able to attend this time but would like to take part in the future, please let us know by email to volunteering@wikimedia.org.uk – we will be offering more sessions in the future.

Please do not hesitate to contact us with any questions. We can also put you in touch with past participants who will be able to share their experiences with you.

by Katie Chan at August 22, 2014 05:14 PM

August 21, 2014

Wiki Education Foundation

For Senior Citizens Day, read about social issues concerning elderly care

Today we celebrate Senior Citizens Day, which was created to honor the contributions senior citizens make all over the U.S. and to raise awareness of social issues concerning the elderly. To honor this awareness, read the Wikipedia article about elderly care, which student editor Ellyhutch expanded during the spring 2012 term in Dr. Diana Strassmann and Dr. Anne Chao’s Poverty, Gender, and Development course.

During the assignment, Ellyhutch added information about gender discrepancies in treatment of the elderly, legal issues regarding incapacity, and examples of elderly care in developing countries. Ellyhutch’s improvements have raised awareness for more than 300,000 readers about these important social issues!

Jami Mathewson
Educational Partnerships Manager

by Jami Mathewson at August 21, 2014 05:05 PM

Niklas Laxström

Midsummer cleanup: YAML and file formats, HHVM, translation memory

Wikimania 2014 is now over and that is a good excuse to write updates about the MediaWiki Translate extension and translatewiki.net.
I’ll start with an update related to our YAML format support, which has always been a bit shaky. Translate supports different libraries (we call them drivers) to parse and generate YAML files. Over time the Translate extension has supported four different drivers:

  • spyc uses spyc, a pure PHP library bundled with the Translate extension,
  • syck uses libsyck, a C library (hard to find any details about), which we call by shelling out to Perl,
  • syck-pecl uses libsyck via a PHP extension,
  • phpyaml uses the libyaml C library via a PHP extension.

The latest change is that I dropped syck-pecl because it does not seem to compile with PHP 5.5 anymore, and I added phpyaml. We tried to use spyc a bit, but the output it produced for localisation files was not compatible with Ruby projects: after complaints, I had to find an alternative solution.

Joel Sahleen let me know of phpyaml, which I had somehow not found before: thanks to him we now use the same libyaml library that Ruby projects use, so we should be fully compatible. It is also the fastest driver of the four. I highly recommend the phpyaml driver to anyone generating YAML files with Translate. I have not checked how phpyaml works with HHVM, but I was told that HHVM ships with a built-in yaml extension.
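As a quick illustration of what the phpyaml driver builds on: the PECL yaml extension exposes plain yaml_parse() and yaml_emit() functions. The round trip below is my own minimal example, not Translate code.

<?php
// Requires the PECL yaml extension, which wraps the same libyaml C library
// that Ruby projects use for YAML.
$yaml = <<<YAML
en:
  greeting: "Hello, world!"
  farewell: "Goodbye"
YAML;

$data = yaml_parse( $yaml );           // YAML string -> PHP array
$data['en']['greeting'] = 'Hi there';  // edit a message

echo yaml_emit( $data );               // PHP array -> YAML string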

Speaking of HHVM, the long standing bug which causes HHVM to stop processing requests is still unsolved, but I was able to contribute some information upstream. In further testing we also discovered that emails sent via the MediaWiki JobQueue were not delivered, so there is some issue in command line mode. I have not yet had time to investigate this, so HHVM is currently disabled for web requests and command line.

I have a couple of refactoring projects for Translate going on. The first is about simplifying the StringMangler interface. This has no user visible changes, but the end goal is to make the code more testable and reduce coupling. For example the file format handler classes only need to know their own keys, not how those are converted to MediaWiki titles. The other refactoring I have just started is to split the current MessageCollection. Currently it manages a set of messages, handles message data loading and filters the collection. This might also bring performance improvements: we can be more intelligent and only load data we need.
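To make the decoupling idea concrete, here is a rough sketch of a key mangler that owns the key-to-title conversion, so a file format handler never has to. The interface and class names are mine, not the actual Translate code.

<?php
// Hypothetical sketch of the idea behind the refactoring: file format
// handlers deal only in their own keys, while a mangler knows how those
// keys are turned into MediaWiki title fragments and back.
interface KeyMangler {
	/** Convert a file-format key into a MediaWiki title fragment. */
	public function mangle( $key );

	/** Reverse the conversion, returning the original key. */
	public function unmangle( $titleFragment );
}

class SlashToDotMangler implements KeyMangler {
	public function mangle( $key ) {
		// Titles are pickier than keys: replace '/' so that keys do not
		// accidentally become subpages.
		return str_replace( '/', '.', $key );
	}

	public function unmangle( $titleFragment ) {
		return str_replace( '.', '/', $titleFragment );
	}
}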

Théo Mancheron competes in the men's decathlon pole vault final

Aiming high: creating a translation memory that works for Wikipedia; even though a long way from here (photo Marie-Lan Nguyen, CC BY 3.0)

Finally, at Wikimania I had a chance to talk about the future of our translation memory with Nik Everett and David Chan. In the short term, Nik is working on implementing in ElasticSearch an algorithm to sort all search results by edit distance. This should bring translation memory performance on par with the old Solr implementation. After that is done, we can finally retire Solr at Wikimedia Foundation, which is much wanted especially as there are signs that Solr is having problems.

Together with David, I laid out some plans on how to go beyond simply comparing entire paragraphs by edit distance. One of his suggestions is to try doing edit distance over words instead of characters. When dealing with the 300 or so languages of Wikimedia, what is a word is less obvious than what is a character (even that is quite complicated), but I am planning to do some research in this area keeping the needs of the content translation extension in mind.
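As a toy illustration of the word-level idea (my own sketch, not the planned ElasticSearch implementation): the usual Levenshtein dynamic-programming recurrence works unchanged if you feed it arrays of word tokens instead of characters.

<?php
// Edit distance over word tokens instead of characters. PHP's built-in
// levenshtein() works on bytes only, so we use a small DP table of our own.
function wordEditDistance( array $a, array $b ) {
	$m = count( $a );
	$n = count( $b );
	$prev = range( 0, $n );
	for ( $i = 1; $i <= $m; $i++ ) {
		$curr = array( $i );
		for ( $j = 1; $j <= $n; $j++ ) {
			$cost = ( $a[$i - 1] === $b[$j - 1] ) ? 0 : 1;
			$curr[$j] = min(
				$prev[$j] + 1,         // delete a word
				$curr[$j - 1] + 1,     // insert a word
				$prev[$j - 1] + $cost  // substitute a word
			);
		}
		$prev = $curr;
	}
	return $prev[$n];
}

// Naive whitespace tokenisation; deciding what a "word" is at all is the
// hard part for many of the 300 or so languages mentioned above.
echo wordEditDistance(
	preg_split( '/\s+/u', 'the quick brown fox' ),
	preg_split( '/\s+/u', 'the slow brown fox jumps' )
); // 2: one substitution plus one insertion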

by Niklas Laxström at August 21, 2014 04:24 PM

Wikimedia UK

Free information, the internet and medicine

The image shows a small leaflet outlining the work of WikiProject: Medicine

This post was written by Vinesh Patel, a junior doctor and an alumnus of Imperial College, London

A new adventure for Wikimedia UK began this summer with a project in collaboration with Imperial College School of Medicine.

In a recent BBC article, Wikimedia UK highlighted the need for everyone looking for medical information to remember Wikipedia is simply an online encyclopedia, and nothing more.

A ganglion is a type of benign fluid collection that can form around the tendons of the hand, and some people used to claim it could be cured with a well-judged thump from a Bible. However, the evidence doesn’t support this practice. Most laypeople today would judge an encyclopedia with a similarly hard cover to be about as useful for solving such medical problems, and they would probably just see their doctor about a lump on their hand. Yet there seems to be a great tangle when the same information is put in an online encyclopaedia.

It is this tangle that is being explored by three groups of medical students as they seek to edit selected Wikipedia articles within the field of medicine. Ten of them, from different year groups, are collaborating with senior academics to edit articles in academic fields they find interesting.

The format is that each group selects a B or C class article from WikiProject Medicine and looks to develop it over several months. They collaborate to edit the article offline and then transcribe their work onto a Wikipedia page, having given notice that they are going to conduct the edit on Wikipedia. One individual then puts the group’s work online. They receive help and guidance from senior academics. After putting their edits on Wikipedia they work with editors around the world to improve the article through the normal routes of discussion on the talk page. The project is running from

The primary aim is to allow the students to develop their academic skills, but it is also hoped that the question of how free information on the internet is used in medicine will be given some practical answers. In the future the program may be expanded to allow students to collaborate with students in developing countries. In fact, many students said the most inspiring aspect of the project is the potential to spread free medical information to their less privileged colleagues around the world, harnessing the possibilities of the internet.

by Stevie Benton at August 21, 2014 03:05 PM

August 20, 2014

Wikimedia Foundation

Remembering Jorge Royan

This is a syndicated post originally published by Wikimedia Argentina. The original Spanish version can be found here.

Wikimedia Argentina is saddened by the passing of our great friend and collaborator, the Argentinian architect and photographer Jorge Royan. Jorge was a winner of the National color photo Ranking by AFA and a gold medal recipient from the International Federation of Photography (FIAP). Jorge also held various exhibitions at local and international events. He was nominated by Agfa International as “professional of the month.”

As well as being a judge for Wiki Loves Monuments Argentina, Jorge donated hundreds of beautiful photos to Wikimedia Commons so that, in his own words, they wouldn’t stay lost in his computer when he’s no longer around, and would serve a greater purpose beyond being just a curiosity to his grandchildren. We hope that his wishes have been granted. Below you will find a small selection of Jorge’s work.

Thank you so much Jorge!

“A camera is like a bird that should be frozen in flight. To decide from where the bird looks into space (and with what eyes) is our job. Sometimes at ground level, others hang from a chandelier from four meters up. To obtain the wings is our responsibility.”

-Jorge Royan

Wedding photography, The Rudolfinum, Prague, Czech Republic

 

A skater in the Vondelpark. Amsterdam, The Netherlands

 

Multi-neck guitar, Paris, France

 

Jama Masjid the main mosque in Delhi, India

 

Via delle Oche, Italy

 

Violin repair shop, Salzburg, Austria

 

Ford Motor Company vintage Ford, Havana, Cuba

 

Maori rowing ceremonial choreography, New Zealand


We are currently looking to incorporate a photo of Jorge into this blog. If you have access to a freely licensed photo of Jorge, please contact us. Thank You.

by carlosmonterrey at August 20, 2014 11:26 PM

August 19, 2014

Wikimedia UK

Building the Open Access Button

This guest blog post was written by David Carroll, Open Access Button Project Lead

Earlier this month, as I sat at the Wikimania Open Data Hack in the Barbican, silently whirring in the back of my mind was an impending anniversary. It had been one year since the first line of code of what would later become the Open Access Button was written. The surroundings themselves were not dissimilar, a year earlier we were a little way across the city of London at The BMJ’s hack weekend and it was there we found an incredible team of developers to make the Open Access Button Beta a reality.

The motivation for building the Open Access Button came just a few months earlier, when in March 2013, Joe McArthur and I learnt that people are systematically denied access to research every day. When we learnt this, we wanted to do something about it. Over the following months, we worked every second of our spare time with an amazing team of volunteer developers and in November 2013 we launched the Open Access Button.

The Button is a browser bookmarklet that allows users to report when they hit a paywall and are denied access to research. Being denied access to research is often an invisible problem and through the Button we aim to make the problem visible, collect the individual experiences, and showcase the global magnitude of the problem. 

So far we’ve tracked and mapped over 8,700 paywalls since the launch. These paywalls represent 8,700 times that scholars were denied access to research in their field: students couldn’t access additional sources for their thesis, or doctors couldn’t read the latest medical research. The stories collected so far are just the tip of a large iceberg of those being denied access to research.

The Button that currently exists at openaccessbutton.org is only an example of what we want to do in the future. Since November, we’ve been working hard planning for the future of the Button and building a global, diverse student team, supplemented by a professional steering committee, to help us make the future Button a success.

Recently, we announced a partnership with Cottage Labs to further develop the Open Access Button. They will work on development of the Open Access Button, in addition to providing hardware and sysadmin support. In addition to re-building the Open Access Button, we’re also working with Wikipedians on the Signalling OA-ness project. The way this tool works is that when someone is making a citation, they can use the “Signalling OA” tool to signal the “openness” of citations on Wikipedia, with the main purpose being to spare readers the disappointment of clicking through to the resource only to find out that they cannot access it. It will also be useful for Wikipedia editors to see if citations are licensed in a way that allows for the images, media or even text to be reused in Wikipedia articles. We’ve been working on this recently and our contribution to this project should be ready by Wikimania next week.

The image shows a map which highlights the places where research has been noted as sitting behind a paywall

A screenshot of the Open Access Button map

The collaboration with Cottage Labs is provided in kind to build core functions of the future Open Access Button and to drive us forward. However, this in-kind support covers only the core functions; to achieve everything we want to achieve and more, we still need your support.

The Open Access Button Beta was built with the support of the Open Access community and a small team of incredible developers who worked with us on a volunteer basis because of their dedication to Openness, and as we develop the future Open Access Button, we want that community spirit to continue. If you’re a developer committed to Openness in your work and want to lend a hand, find us on GitHub; and if you’re a publisher, a library, an organisation or an individual committed to opening up knowledge for all, get in touch. If you can offer financial support, in-kind support or just some helpful words of advice, we would love to hear from you. With your support, we can meet our goals, launch the best button we can and continue to make the problems of paywalls impossible to ignore.

As we work towards the next launch, we are going to continue collecting data and user stories in order to advocate for open access. If you’re hitting paywalls, don’t be silent, download the Button at openaccessbutton.org and report each time you’re denied access to research.

To stay up to date on the progress of the Button, you can follow us on Twitter, like us on Facebook, read our blog or email us at openaccessbutton@medsin.org.

 

by Stevie Benton at August 19, 2014 02:54 PM

August 18, 2014

Wikimedia Foundation

Wikipedia in the classroom: Empowering students in the digital age

Anne in front of the Library at Diablo Valley College.

During her last year of high school, Anne Kingsley took a variety of classes at Sierra College, her local community college in Rocklin, CA. The experience greatly influenced her decision to pursue a career in teaching. “I loved the atmosphere of the community college and remember spending a lot of time printing out articles and copying books in the library,” Anne recalled. “I remember study groups with recent high school grads, returning students, veterans, single moms.”

The eclectic nature of the community college served her well in her first teaching position in 2002 at a New York organization called Friends of Island Academy (FOIA), where she helped youth in the criminal justice system gain literacy and other basic skills. At that time, the Internet was starting to become a valuable educational resource that would soon make photocopying books in the library a nostalgic pastime. Her time at FOIA was the beginning of her discovery of innovative ways to solve big educational problems. “Because I had to run a classroom that had very little materials and almost no budget, you had to be creative about content and curriculum design,” explained Kingsley. “This was a powerful experience to build a foundation for classroom experience as it taught me how to think outside of conventional teaching practices.”

Diablo Valley Community college.

Anne went on to teach at Northeastern University, Menlo College and Santa Clara University. While at Northeastern she pursued her doctorate and was part of a training program where the faculty encouraged curriculums that incorporated new media into the classroom. “This was the beginning of blogs and Facebook, so I remember experimenting with these kinds of shared information sources,” said Anne. Meanwhile Wikipedia, only a few years old at the time, was becoming an increasingly comprehensive encyclopedia. Though at its onset Wikipedia had a reputation for being discouraged by teaching professionals, it has since slowly garnered support and trust from a number of institutions. Today Anne teaches at Diablo Valley College in Pleasant Hill, California, and finds herself once again experimenting with different teaching methods, including the use of Wikipedia.
Tired of assigning the standard research paper and disillusioned by its merits in the 21st century, Anne started to realize that technology has greatly altered the way we access information. Anne elaborates, “I kept thinking that technology has changed the place for research, so why do we keep handing in these static articles as though information doesn’t shift and change all the time. I also knew that old research papers that I had assigned my students were literally piled up in my closet, shoved into boxes, and forgotten about.”

Wikipedia in the classroom.

Simultaneously, Anne kept hearing about underrepresented histories on Wikipedia – from women’s literature to African American history. Though underrepresentation of marginalized subjects is still a concern on Wikipedia, much is being done to address it thanks to people like Anne. “Given that I was teaching at a community college, I figured, let’s see what my students could do with Wikipedia. We all use Wikipedia, so why not see if we could become producers of information rather than just consumers.”

As a Harlem Renaissance enthusiast, Anne taught a course titled “Critical thinking: Composition and Literature Reading the Harlem Renaissance.” It was during this course that she experimented with her idea of producing information in a public forum as a method of learning. Part of the course was to edit articles pertaining to the Harlem Renaissance that were not covered fully on Wikipedia. Using online publications like The Crisis Magazine — an important early 20th century publication for African American culture — the students set out on a journey to research, edit and contribute to the world’s largest encyclopedia.

Humanities Building Classroom at DVC.

Anne and her students soon became aware of the initial learning gap that many new editors face with regard to Wikipedia syntax. Though somewhat intimidating at first, Anne agrees that editing Wikipedia was a great way to teach students how to become literate in a new media language. Her students weren’t the only ones learning something new, Anne explains: “It certainly opened their (and my) eyes to what takes place behind the nicely edited entries.” Another obstacle was trying to figure out how and where to contribute. Anne recalls a student who was hoping to contribute a “religion” entry to the Harlem Renaissance page. The challenge was to figure out where it belonged and how they would go about incorporating it into an existing page in a cohesive manner. Despite a period of adjustment, Anne makes it clear that the benefits she and the students garnered greatly outweigh any difficulties they might initially have had.

From an academic perspective, the assignment captured many of the elements of research that the course aimed to teach – understanding of source material, citation, scholarly research and careful language craft. The fact that Wikipedia is a public forum motivated the students in a manner that perhaps a normal research paper wouldn’t, that is to say, it no longer was just the professor who read the work but also other editors from around the world. The project also proved to be a great collaboration process between the students and the professor. The project lent itself to broader collaboration, especially when it came to the selection process and some of the smaller nuances of contributing to Wikipedia. The project also seemed to greatly improve composition, says Anne, “They (the students) would literally groom their language sentence by sentence – as opposed to earlier experiences writing seven-page research papers where the language fell apart.” Perhaps most satisfying for the students was the sense of accomplishment in seeing their hard work in a public space. Among the new articles created were pages for Arthur P. Davis, a section for religion in the Harlem Renaissance article and a page for Georgia Douglas Johnson – formerly a stub.

Anne expresses great interest in assigning this project again to her students. “I don’t always get to select the classes I teach, but if I had the opportunity to teach the Harlem Renaissance again, I would repeat this curriculum.” When asked what she would do differently, if anything, she replied, “More time. I only gave my students four weeks to create their entries. I did not realize how many of them would choose to create full-length articles or more complex entries.” Anne is part of a growing number of teaching professionals who choose to think outside the box and embrace new mediums in an effort to not only contribute to the greater good, but also prepare their students for a 21st century academic landscape. She had a clear message to her colleagues who perhaps might not be as embracing of Wikipedia in the classroom, she says, “Think big…students have this amazing capacity to want to experiment with you and others, especially when it makes their work visible and meaningful.”

Carlos Monterrey, Communications Associate at the Wikimedia Foundation

by carlosmonterrey at August 18, 2014 09:42 PM

Wiki Education Foundation

Welcome, Lorraine Hariton

A very warm welcome to Lorraine Hariton, the newest board member of the Wiki Education Foundation. I’m thrilled about Lorraine joining the Wiki Education Foundation board and bringing her expertise with technology and board service to our organization. I’m pleased to see the Wiki Education Foundation board shaping up so well with strong members like Lorraine who can help us achieve our goals to be the link between Wikipedia and academia in the United States and Canada.

I look forward to working with Lorraine in the coming months and years.

Frank Schulenburg
Executive Director

by Frank Schulenburg at August 18, 2014 05:12 PM

Magnus Manske

The Men Who Stare at Media

Shortly after the 2014 London Wikimania, the happy world of Wikimedia experienced a localized earthquake when a dispute between some editors of the German Wikipedia and the Wikimedia Foundation escalated into exchanges of electronic artillery. Here, I try to untangle the threads of the resulting Gordian knot, interwoven with my own view on the issue.

Timeline

As best as I can tell, the following sequence of events is roughly correct:

  1. The WMF (Wikimedia Foundation) decides to update and, at least by intention, improve the viewing of files (mostly images), mainly when clicked on in Wikipedia. The tool for this, dubbed MediaViewer, would do what most people expect when they click on a thumbnail on a website in 2014, and be activated by default. This is aimed at the casual reader, comprising the vast majority of people using Wikipedia. For writers (that is, “old hands” with log-ins), there is an off switch.
  2. A small group of editors on English Wikipedia suggest that the MediaViewer, at least in its current state, is not suitable for default activation. This is ignored by the WMF due to lack of total votes.
  3. A “Meinungsbild” (literally “opinion picture”; basically, a non-binding poll) is initiated on German Wikipedia.
  4. The WMF posts on the Meinungsbild page that it (the WMF) reserves the right to overrule a negative result.
  5. About 300 editors vote on German Wikipedia, with ~2/3 against the default activation of the MediaViewer.
  6. The WMF, as announced, overrules the Meinungsbild and activates the MediaViewer by default.
  7. An admin on German Wikipedia implements a JavaScript hack that deactivates the MediaViewer.
  8. The WMF implements a “super-protect” right that locks out even admins from editing a page, reverts the hack to re-enable the MediaViewer, and protects the “hacked” page from further editing.
  9. Mailing list shitstorm ensues.

An amalgamate of issues

In the flurry of mails, talk page edits, tweets, blog posts, and press not-quite-breaking-news items, a lot of issues were thrown into the increasingly steaming-hot soup of contention-laden bones. Sabotage of the German Wikipedia by its admins, to prevent everyone from reading it, was openly suggested as a possible solution to the problem, Erik Möller of WMF was called a Nazi, and WMF management was accused of raking in the donations for themselves while only delivering shoddy software. I’ll try to list the separate issues that are being bundled under the “MediaViewer controversy” label:

  • Technical issues. This includes claims that MediaViewer is useless, not suitable for readers, too buggy for prime time, violates copyright by hiding some licenses, etc.
  • WMF response. Claims that the Foundation is not responding properly to technical issues (e.g. bug reports), community wishes, etc.
  • WMF aim. Claims that the Foundation is focusing exclusively on readers and new editors, leaving the “old hands” to fend for themselves.
  • Authority. Should the WMF or the community of the individual language edition have the final word about software updates?
  • Representation: Does a relatively small pool of vocal long-time editors speak for all the editors, and/or all the readers?
  • Rules of engagement: Is it OK for admins to use technological means to enforce a point of view? Is it OK for the WMF to do so?
  • Ownership: Does the WMF own Wikipedia, or do the editors who wrote it?

A house needs a foundation

While the English word “foundation” is known to many Germans, I feel it is often interpreted as “Verein”, the title of the German Wikimedia chapter. The literal translation (“Fundament”), and thus its direct meaning, are often overlooked. The WMF is not “the project”; it is a means to an end, a facilitator, a provider of services for “the community” (by whatever definition) to get stuff done. At the same time, “the community” could not function without a foundation; some argue that the community needs a different foundation, because the next one will be much better, for sure. Thankfully, these heroic separatists are a rather minute minority.

The foundation provides stability and reliability; it takes care of a lot of necessary plumbing and keeps it out of everyone’s living room. At the same time, when the foundation changes (this is stretching the literal interpretation of the word a bit, unless you live in The Matrix), everything built on the foundation has to change with it. So what does this specific foundation provide?

  • The servers and the connectivity (network, bandwidth) to run the Wikis.
  • The core software (MediaWiki) and site-specific extensions. Yes, since it’s open source, everyone can make a fork, so WMF “ownership” is limited; however, WMF employs people to develop MediaWiki, with the specific aim of supporting the WMF’s projects. Third-party use is widespread, but not a primary aim.
  • The setup (aka installation) of MediaWiki and its components for the individual projects.
  • The people and know-how to make the above run smoothly.
  • Non-technical aspects, such as strategic planning, public relations and press management, legal aspects etc. which would be hard/impossible for “the community” to provide reliably.
  • The money to pay for all of the above. Again, yes, the money comes from donations; but WMF collects, prioritizes, and distributes it; they plan and execute the fundraising that gets the money in.

The WMF does specifically not provide:

  • The content of Wikipedia, Commons, and other projects.
  • The editorial policies for these projects, beyond certain basic principles (“Wikipedia is an encyclopedia, NPOV, no original research”, etc.) which are common to all language editions of a project.

Authorities

I think that last point deserves attention in the light of the battle of MediaViewer. The WMF is not just your hosting provider. It does stand for, and is tasked to uphold, some basic principles of the project, across communities and languages. For example, the “neutral point of view” is a basic principle on all Wikipedias. What if a “community” (again, by whatever definition) were to decide to officially abandon it, and have opinionated articles instead? Say, the Urdu edition, a language mostly spoken in Pakistan (which I chose as a random example here!). I think that most editors, from most “communities”, would want the WMF to intervene at that point, and rightly so. You want opinionated texts, get a blog (like this one); the web is large enough. In such a case, the WMF should go against the wishes of that “community” and, if necessary, enforce NPOV, even if it means to de-admin or block people on that project. And while I hope that such a situation will never develop, it would be a case where the WMF would, and should, enforce editorial policy (because otherwise, it wouldn’t be Wikipedia anymore). Which is a far more serious issue than some image viewer tool.

The point I am trying to make here is that there are situations where it is part of the mission and mandate of WMF to overrule “the community”. The question at hand is, does MediaViewer comprise such a situation? It is certainly a borderline case. On one hand, seen from the (German) “community” POV, it is a non-essential function that mostly gets in the way of the established editors that are most likely to show up on the Meinungsbild, and admittedly has some software issues with a generous sprinkling of bug reports. On the other hand, from the WMF’s point of view, the dropping number of editors is a major problem, and it is their duty to solve it as best as they can. Some reasons, e.g. “newbie-biting”, are up to the communities and essentially out of the WMF’s control. Other reasons for the lack of “fresh blood” in the wiki family include the somewhat antiquated technology exposed to the user, and that is something well within its remit. The Visual Editor was developed to get more (non-technical) people to edit Wikipedia. The Upload Wizard and the MediaViewer were developed to get more people interested in (and adding to) the richness of free images and sounds available on the sites.

The Visual Editor (which seems to work a lot better than it used to) represents a major change in the way Wikipedia can be used by editors, and its initial limitations were well known. Here, the WMF did yield to the wishes of individual “communities”, and not even an option for the Visual Editor is shown on German Wikipedia for “anonymous” users.

The MediaViewer is, in this context, a little different. Most people (that is, anonymous readers of Wikipedia, all of which are potential future editors) these days expect that, when you click on a thumbnail image on a website, you see a large version of it. Maybe even with next/prev arrows to cycle through available images on the page. (I make no judgement about whether this is the right thing; it just is this way.) Instead, Wikipedia thus far treated the reader to a slightly larger thumbnail, surrounded by mostly incomprehensible text. And when I say “incomprehensible”, I mean people mailing me to ask if they could use my image from Commons; they skip right past the {{Information}} template and the license boxes to look for the uploader, which happens to be my Flickr/Wikipedia transfer bot.

So the WMF decided that, in this specific case, the feature should be rolled out as default, on all projects instead of piecemeal like the Visual Editor (and do not kid yourself, it will come to every Wikipedia sooner or later). I do not know what prompted this decision; consistency for multilingual readers, simplicity of maintenance, pressure on the programmers to get the code into shape under the ensuing bug report avalanche, or simply the notion of this being a minor change that can be turned off even by anonymous users. I also do not know if this was the right technical decision to make, in light of quite a few examples where MediaViewer does not work as correctly as it should. I am, however, quite certain that it was the WMF’s right to make that decision. It falls within two of their areas of responsibility, which are (a) MediaWiki software and its components, and (b) improving reader and editor numbers by improving their experience of the site. Again, no judgement whether or not it was the right decision; just that it was the WMF’s decision to make, if they chose to do so.

Respect

I do, however, understand the “community’s” point of view as well; while I haven’t exactly been active on German Wikipedia for a while, I have been around through all of its history. The German community is very dedicated to quality; where the English reader may be exposed to an army of Pokemons, the article namespace in German Wikipedia is pruned rather rigorously (including an article about Yours Truly). There are no “mispeeling” redirects (apparently, if you can’t spell correctly, you have no business reading an encyclopedia!), and few articles have infoboxes (Wikipedia is an encyclopedia, not a trading card game!). There are “tagging categories”, e.g. for “man” and “woman”, with no subcategories; biographies generally have Persondata and authority control templates. In short, the German community is very much in favor of rigorously controlling many aspects of the pages, in order to provide the (in the community’s view) best experience for the user. This is an essential point: the German community cares very much about the reader experience! This is not to say that other languages don’t care; but, in direct comparison, English Wikipedia is an amorphous free-for-all playground (exaggerating a bit here, but only a bit). If you don’t believe me, ask Jimbo; he speaks some German, enough to experience the effect.

So some of the German editors saw (and continue to see) the default activation of the MediaViewer as an impediment to not only themselves, but especially to the reader. And while Germans are known for their “professional outrage”, and some just dislike everything new (“it worked for me so far, why change anything?”), I believe the majority of editors voting against the MediaViewer are either actually concerned about the reader experience, or were convinced (not to say “dragged into”) by those concerned to vote “no”.

The reactions by the WMF, understandably as they are from their perspective, namely

  1. announcing that it would ignore the “vote” (not a real, democratic vote, which is why it’s called “Meinungsbild” and not “Wahl”)
  2. proceeding to ignore the vote
  3. using “force” to enforce their decision

were interpreted by many editors as a lack of respect. We the people editors wrote the encyclopedia, after all; how dare they (the WMF) change our carefully crafted user experience, and ignore our declared will? It is from that background that comparisons to corporate overlords etc. stem, barely kept in check by Mike Godwin himself. And while such exaggerations are a common experience to everyone on the web, they do not exactly help in getting the discussion back to where it should be. Which is “where do we go from here”?

The road to hell

One thing is clear to me, and I suspect even to the most hardened edit warrior in the wikiverse: Both “sides”, community and WMF, actually want the same thing, which is to give the reader the best experience possible when browsing the pages of any Wikimedia project. The goal is not in question; the road to get there is. And whose authority it is to decide that.

On the technical side, one issue is the testing-and-fixing cycle. Traditionally, the WMF has made new functionality available for testing by the community quite early. By the same tradition, that option is ignored by most members of that community, who then complain about being steamrollered when the feature suddenly appears on the live site. On the other hand, the WMF has rolled out both the Visual Editor and the MediaViewer in a state that would be called “early beta” in most software companies. “Release early, release often” is a time-honored motto in open source software development; but in this specific case, using early releases in production isn’t optional for the users. From discussions I had at Wikimania, I have the distinct impression that people expect a higher standard of quality for software rolled out by the WMF on the live sites, especially if it becomes default. How this should work without volunteers to test early remains a mystery; maybe a little more maturity on the initial release, followed by more widespread use of “beta” features, is part of the answer here.

On the votes-vs-foundation side, I am of the opinion that clearer lines need to be drawn. The WMF does have a responsibility for user experience, which includes software changes, some of which will have to be applied across the wikiverse to be effective; the upcoming “forced account unification” for (finally!) Single User Login comes to mind. And, in a twist on the famous Spiderman quote, with great responsibility needs to come great power to fulfill it. Responsibility without power is the worst state one can have in a job, which even the most uncompromising “community fighter” will agree to. So if and when the WMF makes such a decision within their remit, the energy of the community would be best spent in feeding back the flaws in order to get the best possible result, instead of half-assed attempts at sabotage (I much prefer full-assed attempts myself).

There is, of course, another side of that coin. In my opinion, the WMF should leave the decision for default activation of a new feature to a representative vote of a community, unless the activation is necessary for (a) technical, (b) consistency, or (c) interdependency reasons. A security fix would fall under (a); the Single User Login will fall under (c); MediaViewer falls under (b), though somewhat weakly IMHO. Now, the key word in the beginning of this paragraph is “representative”. I am not quite sure how this would work in practice. I am, however, quite sure it is not 300 editors (or Spartans) voting on some page. It could include votes by a randomized subset of readers. It could also include “calls to vote” as part of beta features, e.g. if you had the feature enabled in the last week. These could be repeated over time, as the “product” would change, sometimes significantly so, as happened with the Visual Editor; a “no” three months ago would be quite invalid today.

Finally, I believe we need at least part of the above written out, and agreed upon, by both the WMF and “the communities”. It is my hope that enough people will share my opinion that both “parties” still have a common goal. Because the house that is Wikipedia cannot stand without a foundation, and a foundation without a house on top is but a dirty pond.

by Magnus at August 18, 2014 03:38 PM

Gerard Meijssen

#Twitter - #WikiParliaments.. but what about #Wikidata and #Austria?

Twitter advertised several things that I might like. WikiParliaments could be one of them. Today I learned that Othmar Tödling died. He was a member of the "Nationalrat" of Austria. As such he might be very much of interest to WikiParliaments.

Politicians are human too; they die. When they do, it is often noted in a category what function they held. Today I started adding statements for those humans who hold or held the function of parliamentarian in Austria.

My hope is that people who care about parliaments will make it even prettier and embellish them with even more statements and qualifiers.
Thanks,
     GerardM

by Gerard Meijssen (noreply@blogger.com) at August 18, 2014 09:00 AM

Santhosh Thottingal

Talk at Wikimania 2014

I presented the Content Translation project of my team at Wikimania 2014 at London. Here is the video of the presentation.

<iframe allowfullscreen="allowfullscreen" frameborder="0" height="360" src="http://www.youtube.com/embed/b6qvv3eJ_Ag?start=1947" width="640"></iframe>

by Santhosh Thottingal at August 18, 2014 03:09 AM

Tech News

Tech News issue #34, 2014 (August 18, 2014)


August 18, 2014 12:00 AM

August 17, 2014

Gerard Meijssen

#MediaWiki - #MediaViewer rehashed

Some things are plain stupid, sometimes I am and sometimes someone else is. I filed a bug about my experience of the MediaViewer. For me it is a show stopper; it prevents me from using it easily.

The problem is that Chrome shows a really awful URL for an image with funny characters in its title. When I look at it using the MediaViewer it is bad but it looks fine when I look at it from the Commons page.
  • File:%C3%89cole_normale_sup%C3%A9rieure_de_Paris,_26_January_2013.jpg
  • File:École normale supérieure de Paris, 26 January 2013.jpg
According to the Bugzilla triage I must be stupid because it works; it complies with specifications and, indeed technically it works. It just stopped working for me.
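For what it is worth, the ugly form is simply the UTF-8 bytes of the title percent-encoded; a two-line illustration of my own:

<?php
// 'É' is the two UTF-8 bytes 0xC3 0x89; percent-encoding them yields %C3%89,
// which is the form Chrome chooses to display in the MediaViewer URL.
echo rawurlencode( 'École' ), "\n";                              // %C3%89cole
echo rawurldecode( '%C3%89cole_normale_sup%C3%A9rieure' ), "\n"; // École_normale_supérieure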

Several reactions are possible. My choice was to shrug, mutter "it is the user experience, stupid" and get on with my life. Others see it as a precursor to the invasion of an evil overlord who does not understand the world, and prepare for war.

By filing a bug, by posting this blog I have rid myself of my frustrations. I know several developers; I met many of them at Wikimania and I know they are really dedicated and mean well. I also know that such things pass. I am sure someone will see the light or Google will fix Chrome (if that is where the bug lives). In the end I do not look at images that often as a result.
Thanks,
       GerardM

by Gerard Meijssen (noreply@blogger.com) at August 17, 2014 08:56 PM

#Wikidata - giving a #category an application

Many #Wikimedia categories have interlanguage links. Obviously these linked categories do not all have the same content. Someone has to add the articles; sometimes it gets done and sometimes it doesn't. Often the articles just do not exist.

When the facts that are implicit in what a category is about are added to all the items in all the linked categories, you typically end up with a superset in Wikidata. It does not stop there; Wikidata may include items that are not in any of those linked categories.

This is all theoretical unless ... unless you can query Wikidata and use the results. Much data has been added to Wikidata based on the content of categories, and queries have been used to identify missing items; this is done using AutoList2. This is one application; it is used by some of the "advanced" users of Wikidata.

What is even more interesting is showing what Wikidata thinks should be in a category. This is done using Reasonator. At this time, statements that define a query are included for over 690 categories. These queries are already complex enough that Wikidata's own query functionality will not be able to express the results.

These queries could be of use to "advanced" Wikipedians because it is a basis for identifying articles that have not been categorised or articles that still need to be written in their Wikipedia. For everyone else it is just interesting; this information exists and it is readily available. It is one way of learning that Wikidata knows for instance about 121,922 politicians.
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at August 17, 2014 02:06 PM

#Wikidata - sources or confidence

At this time Wikidata has more than 36,396,372 statements; these statements are associated with some 15,335,451 items. The majority of these items have fewer than five statements and, even worse, for many items it is not known what they are about.

When you consider the quality of this data, there are two schools of thought. There are those who insist on sources with every statement, and there are those who have confidence in the validity of the data because they know where it came from.

Either way, when you want to assert that a specific approach is superior, it becomes a numbers game and understanding the relative merits is what it is all about. When something is sourced, you can be confident that it was highly probably correct at the time of the sourcing. There is however no certainty that the data remains stable. Confidence can be maintained by regularly comparing the data with what the source has to say.

When the data is regularly compared, it does not matter that much whether Wikidata has source information itself. The source is typically one of the Wikipedias, and they are said to have sources; this may provide us with enough reason for confidence. The comparison of data increases this confidence, particularly when multiple sources prove to be in agreement.

Practically, the basic building blocks to start comparing exist. It has been done before by Amir, and he produced long lists of differences. Three things are needed to establish new best practices (a minimal sketch of such a comparison follows the list):
  • a well defined place needs to be established where such reports may be found
  • communities need to understand that it raises confidence in their project
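A minimal sketch of what such a comparison could look like, using the public wbgetclaims API; the item, property and "external" value below are made up purely for illustration, not taken from any real comparison run.

<?php
// Fetch one claim from Wikidata and compare it with a value from elsewhere.
// Q42/P569 (Douglas Adams, date of birth) is only an example item/property.
$url = 'https://www.wikidata.org/w/api.php?action=wbgetclaims'
	. '&entity=Q42&property=P569&format=json';

$data  = json_decode( file_get_contents( $url ), true );
$claim = $data['claims']['P569'][0]['mainsnak']['datavalue']['value']['time'];

$externalValue = '+1952-03-11T00:00:00Z'; // stand-in for another database

if ( $claim === $externalValue ) {
	echo "Values agree; confidence in both sources goes up.\n";
} else {
	echo "Difference found: Wikidata says $claim, the source says $externalValue\n";
}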
Thanks,
   GerardM

by Gerard Meijssen (noreply@blogger.com) at August 17, 2014 01:54 PM

August 16, 2014

Tony Thomas

Writing PHP unit-tests : How to add a fake user table entry

Flexibility of the MediaWiki PHP unit-test wrapper extends to the fact that fake database entries can be made and tested against without causing any harm to the actual database. As I scribbled in the earlier post, the class comments play a vital role, and don't forget to give them like this: This would create a fake […]
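Setting the elided snippet aside, here is a hedged sketch of my own (assuming a reasonably recent MediaWiki core) of the pattern the post describes; the key part is the @group Database class comment, which makes the test run against cloned, disposable tables.

<?php
/**
 * @group Database
 */
class FakeUserEntryTest extends MediaWikiTestCase {

	/** Called by the test framework to populate the cloned test tables. */
	public function addDBData() {
		$user = User::newFromName( 'UTDummyUser' );
		if ( $user->getId() === 0 ) {
			$user->addToDatabase(); // fake user table entry, torn down afterwards
		}
	}

	public function testFakeUserWasCreated() {
		$user = User::newFromName( 'UTDummyUser' );
		$this->assertGreaterThan( 0, $user->getId() );
	}
}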

by Tony Thomas at August 16, 2014 03:35 PM

Writing PHP Unit tests to verify extension API POST

PHP unit tests are crucial before deployment to make sure that the degree of damage your extension can cause is minimal. How to start was always my worry, and here we go. Consider that I have an API called ‘myapi’ that POSTs a string $myvar: The job = submitted makes sure that the job is completed, […]
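Again a hedged sketch rather than the post's elided code: using MediaWiki's ApiTestCase base class, a POST to the hypothetical 'myapi' module with its 'myvar' parameter could be exercised roughly like this (assuming the module requires a token, as write modules usually do).

<?php
/**
 * @group API
 * @group Database
 */
class MyApiPostTest extends ApiTestCase {

	public function testMyApiAcceptsPostedString() {
		// 'myapi' and 'myvar' are the hypothetical names from the post;
		// substitute your extension's real module and parameter names.
		list( $result ) = $this->doApiRequestWithToken( array(
			'action' => 'myapi',
			'myvar'  => 'some test string',
		) );

		// Assuming the module reports its status under its own name.
		$this->assertArrayHasKey( 'myapi', $result );
	}
}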

by Tony Thomas at August 16, 2014 02:12 PM

Gerard Meijssen

#Wikidata - application for its long tail

When Lauren Bacall died this week, it was all over the news. When Marjorie Stapp died on June 2, 2014 it was noted in the English Wikipedia only yesterday. Today it is known to Wikidata, and several bits of information were added to the item about Mrs Stapp as well.

Among those statements is her identifier in the IMDB. The IMDB does not know yet about the demise of Mrs Stapp and it is not unlikely that there are more actors and actresses we know about that have died. Providing external sources like the IMDB with an RSS feed of the changes that are made in Wikidata is not hard.

When we share our information in this way, we gain friends. With these new friends we may do friendly things like noting differences between the data that we hold. Equally important, we add a reason why people might maintain the data that is in Wikidata. As our data gains in application, we will grow and diversify our community.
Thanks,
     GerardM

by Gerard Meijssen (noreply@blogger.com) at August 16, 2014 06:50 AM

August 15, 2014

Wiki Education Foundation

Student editor translates French article into English

Have you ever looked up a city or region on Wikipedia to learn more about it—perhaps before traveling there? If you’re on English Wikipedia and searching for a non-English-speaking place, you may find minimal information beyond its geography.

Dr. Julie McDonough Dolmaya’s spring 2014 translation course at York University sought to minimize the discrepancies between French Wikipedia and English Wikipedia, as student editors developed their translation skills by selecting an article to translate and expand.

Student editor Azink2 found the article about Aubagne and more than doubled its content. Now, the 2,000+ readers per month on English Wikipedia can also learn about the history, demographics, and politics of the region!

Jami Mathewson
Program Manager

by Jami Mathewson at August 15, 2014 05:04 PM

Wikimedia UK

The GLAM-Wiki Revolution

This post was written by Joe Sutherland and User:Rock drum

During Wikimania 2014 last week, we were lucky enough to be able to screen our documentary about the GLAM-Wiki programme in the UK. The film brings together interviews with some of the Wikimedians in Residence from institutions across the country – and with Wikimedia UK staff. We want it to function as an outreach tool – as a way of teaching people about the GLAM programme, but also as a celebration of the work of so many volunteers and paid Wikimedians in Residence.

Over the coming weeks we will be sharing additional content from this project, written interviews and shorter videos which will also be published through Wikimedia UK’s channels. We will also be releasing some of the source footage on Wikimedia Commons under a Creative Commons license.

We are pleased to be able to share this video online, both on YouTube and Wikimedia Commons. We hope you enjoy it.

<iframe allowfullscreen="allowfullscreen" frameborder="0" height="360" src="https://www.youtube-nocookie.com/embed/UlNT16gqHyo?list=PL66MRMNlLyR6BuplUUTWvyl4_klBOZzNT" width="640"></iframe>

by Richard Nevell at August 15, 2014 01:54 PM

August 14, 2014

Wiki Education Foundation

Wiki Ed presents at Wikimania

We’re back from an amazing trip to London! Last week, four Wiki Education Foundation staff and three board members were in the thick of the global Wikimedia community at the annual Wikimania conference. This year Wikimania had a strong focus on education, and it drew large crowds of people from all over the world who were interested in incorporating Wikipedia into educational settings. In our sessions, the Wiki Education Foundation contingent shared our knowledge with the community so that others could learn from what we’ve done.

Ask Wiki Ed

LiAnna Davis, Frank Schulenburg, and Jami Mathewson at the “Ask the Wiki Education Foundation” session.

Educational Partnerships Manager Jami Mathewson and Director of Programs LiAnna Davis were active leaders in the Education Pre-Conference prior to Wikimania. Jami presented during a Wikipedia Ambassador training for people interested in supporting class-based programs globally, as well as a session for helping new program leaders find the best structure for their programs. LiAnna led a half-day workshop on how to use Wikipedia as a teaching tool. All three sessions were very well-attended, with more than 50 people representing at least 18 countries around the world in attendance.

More than 100 attended an education session in which Wiki Education Foundation staff presented in all three speaking slots. First up was “Ask the Wiki Education Foundation“, where LiAnna, Jami, and Frank presented information about our organization and then opened the floor for questions from the audience. Next up, LiAnna joined colleagues from the Wikimedia Foundation, Israel, and the United Kingdom to present information about the Wikipedia Education Collaborative, a group of education program leaders worldwide who coordinate sharing learnings. Finally, LiAnna presented “The 7 Biggest Mistakes the Wikipedia Education Program’s Made — and What We’ve Learned From Them“. All three had a great response from the audience, with interesting questions.

Diana Strassmann

Diana Strassmann gives a keynote. (Photo by SLOWKING under CC BY-NC via Wikimedia Commons)

Wiki Ed’s Board Chair, Diana Strassmann, had a featured speaker slot on Saturday. Diana’s talk, which you can watch online, focused on how the Wikipedia Education Program can help Wikipedia. Diana highlighted different theories of knowledge that come from academia, and how these might help Wikipedia overcome some of its current challenges, including the gender gap. Diana also joined Wikipedia Founder Jimmy Wales and other guests for a BBC World radio interview on a show called “In the Balance” focusing on “The Future of Education” to talk about her experiences using Wikipedia as a teaching tool.

Finally, Jami joined program leaders from the Arab World, Israel, and Mexico in a session exploring how the Wiki Education Foundation uses numerical data to evaluate our programmatic efforts.

In addition to the scheduled activities, Wiki Ed staff and board members participated in a number of social events, meeting with Wikipedia editors and program leaders globally to share our experiences teaching students how to edit Wikipedia as part of their coursework and learning from others’ experiences. Many thanks to the Wikimania organizing team for putting education front-and-center at this year’s conference!

LiAnna Davis
Director of Programs

by LiAnna Davis at August 14, 2014 03:12 PM

Gerard Meijssen

#Wikimedia - the quality of access to the sum of all human knowledge


Again, a big flare-up of "we the community" demands this and that. Again, what Wikipedia and the Wikimedia Foundation are about is conveniently forgotten. At Wikimania there was a really interesting presentation by Raph Koster, author of "A Theory of Fun for Game Design". Well worth watching once it is available for viewing.

An abstraction of the current hoo-ha is in there, and this community is described as the monsters who rule it all (my words, his pictures). These people who impose their world on others have forgotten what the game is about. It is about providing access to the sum of all knowledge. From that perspective, their issues with the multimedia viewer are hardly significant compared with the increased ease for the people who just access the parts of human knowledge we do give access to.

My pet example of "the community" not caring about providing access to our available knowledge is the decision that easy and obvious access to fonts adds clutter to the user interface and is therefore not acceptable... About seven percent of a population is dyslexic, and it is extremely hard to find and enable the OpenDyslexic font. It took a MediaWiki developer over two minutes, and he enabled it in a way I did not know existed... and he knew it existed and he knew the name of the font. This demonstrates how relevant seven percent of our reader population is to our community.

Should we primarily care about access or is it a playground for monsters?
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at August 14, 2014 08:33 AM

#Wikidata - It ain't got a thing

A rose is a rose is a rose, and a rose by any other name would smell as sweet... Actually, people are quite smart and know a rose when they see one. Machines need to be told what a rose is.

Wikidata has this requirement of being usable by machines. So we need to state what kind of thing a thing is; for all humans, for instance, it needs to be stated that they are considered human.
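
To make the machine-readability point concrete, here is a minimal sketch (illustrative only, not code from the post) of how a machine asks Wikidata what a thing is: it fetches the "instance of" (P31) statements for an item via the wbgetclaims module of the Wikidata API. The instance_of helper name is made up for illustration.

    # Minimal sketch: ask the Wikidata web API which classes an item is an
    # "instance of" (property P31). Needs the `requests` package.
    import requests

    def instance_of(item_id):
        """Return the Q-ids that item_id is declared an instance of."""
        response = requests.get(
            "https://www.wikidata.org/w/api.php",
            params={
                "action": "wbgetclaims",
                "entity": item_id,
                "property": "P31",
                "format": "json",
            },
        )
        claims = response.json().get("claims", {}).get("P31", [])
        return [
            "Q%d" % claim["mainsnak"]["datavalue"]["value"]["numeric-id"]
            for claim in claims
            if claim["mainsnak"].get("snaktype") == "value"
        ]

    # Q42 (Douglas Adams) reports Q5 (human); an item for which this list
    # comes back empty is exactly the kind of unidentified "thing" at issue.
    print(instance_of("Q42"))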

Several high-powered people at Wikimania expressed the opinion that for Wikidata to get into full swing, we have to identify every thing.

I have identified a few hundred "list articles": items that start with "List of " or "Member of ", for instance. I have also identified a lot of "group of people" items that were supposedly born in the 20th century.

At Wikidata a thing is bad. We cannot safely select it, and we cannot auto-describe it. We should get rid of every thing.
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at August 14, 2014 06:47 AM

August 13, 2014

This month in GLAM

This Month in GLAM: July 2014

by Admin at August 13, 2014 09:21 PM

Sumana Harihareswara

Case Study of a Good Internship

I'm currently a mentor for Frances Hocutt's internship in which she evaluates, documents, and improves client libraries for the MediaWiki web API. She'll be finishing up this month.
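
As background, the client libraries she is working with are essentially wrappers around plain HTTP calls to the MediaWiki web API. A minimal sketch of such a call (illustrative only, not code from the internship itself):

    # Minimal sketch: the kind of raw MediaWiki web API request that a
    # client library wraps up for you. Needs the `requests` package.
    import requests

    response = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={
            "action": "query",
            "meta": "siteinfo",
            "format": "json",
        },
    )
    # The siteinfo query returns general metadata about the wiki.
    print(response.json()["query"]["general"]["sitename"])  # "Wikipedia"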

I wanted to share some things we've done right. This is the most successful I've ever been at putting my intern management philosophy into practice.

  • A team of mentors. I gathered a co-mentor and two technical advisors: engineers who have different strengths and who all promised to respond to questions within two business days. Frances is reading and writing code in four different languages, and is able to get guidance in all of them. The other guys also have very different perspectives. Tollef has worked in several open source contexts but approaches MediaWiki's API with a learner's mind. Brad has hacked on the API itself and maintains a popular Wikipedia bot that uses it. And Merlijn is a maintainer of an existing client library that lots of Wikimedians use. I bring deep knowledge of our technical community, our social norms, and project management. And I'm in charge of the daily "are you blocked?" communication so we avoid deadlocks.

  • Frequent communication. Any time Frances needs substantial guidance, she can ping one of her mentors in IRC, or send us a group email. She also updates a progress report page and tells our community what she's up to via a public mailing list. We have settled into a routine where she checks in with me every weekday at a set time. We videochat three times a week via appear.in (its audio lags so we use our cell phones for audio), and use a public IRC channel the other two weekdays. We also frequently talk informally via IRC or email. She and I have each other's phone numbers in case anything is really urgent.

  • Strong relationship. I met Frances before we ever thought about doing OPW together. I was able to structure the project partly to suit her strengths. We've worked together in person a few times since her project started, which gave us the chance to tell each other stories and give each other context. I've encouraged her to submit talks to relevant conferences, and given her feedback as she prepared them. Frances knows she can come to us with problems and we'll support her and figure out how to solve them. And our daily checkins aren't just about the work -- we also talk about books or silliness or food or travel or feminism or self-care tips. There's a healthy boundary there, of course, since I need to be her boss. But our rapport makes it easier for me to praise or criticize her in the way she can absorb best.

  • Frances is great. I encouraged her as an applicant; from her past work and from our conversations, I inferred that she was resourceful, diligent, well-spoken, analytical, determined, helpful, and the kind of leader who values both consensus and execution. I know that many such people are currently languishing, underemployed, underappreciated. A structured apprenticeship program can work really well to help reflective learners shine.

    I got to know Frances because we went to the same sci-fi convention and she gave me a tour of the makerspace she cofounded. Remember that just next to the open source community, in adjacent spaces like fandom, activism, and education, are thousands of amazing, skilled and underemployed people who are one apprenticeship away from being your next Most Valuable Player.


  • Scope small & cuttable. Frances didn't plan to make one big monolithic thing; we planned for her to make a bunch of individual things, only one of which (the "gold standard" by which we judge API client libraries) needed to happen before the others. This came in very handy. We hadn't budgeted time for Frances to attend three conferences during the summer, and of course some programming bits took longer than we'd expected. When we needed to adjust the schedule, we decided it was okay for her to evaluate eight libraries in four languages, rather than eleven in five languages. The feature she's writing may spill over a few days past the formal end of her internship, and we're staying aware of that.

  • Metacognition. As Madison said, "If men were angels, we would have no need of government." But we're flawed, and so we have to keep up the discipline of metacognition, of figuring out what we are bad at and how to get better. I asked Frances to self-assess her learning styles and have used that information to give her resources and tasks that will suit her. Early in the internship I messed up and suggested a very broad, ill-defined miniproject as a way to learn more about the MediaWiki API; since then I've learned better what to suggest as an initial discovery approach. Halfway into the internship we realized we weren't meeting enough, so we started the daily videochat-or-IRC appointment. I have let Frances know that I can be a bad correspondent so it's fine to nag me, to remind me that she's blocked on something, to ask other mentors for help. And so on. We've learned along the way, about each other and about ourselves. My mom says, "teaching is learning twice," and she's right.

Setting up an internship on a strong foundation makes it a smoother, less stressful, and more joyous experience for everyone. I've heard lots of mentors' stories of bad internships, but I don't think we talk enough about what makes a good internship. Here's what we are doing that works. You?


(P.S. Oh and by the way you can totally hire Frances starting in September!)

August 13, 2014 07:48 PM