en.planet.wikimedia

April 20, 2014

Gerard Meijssen

#Wikidata - its sex ratio

In a perfect world, Wikidata would know the sex of every person for whom any Wikipedia has pertinent information. In a perfect world, you could query Wikidata for the sex ratio of each Wikipedia.

As we know, the world is not perfect. Wikidata currently knows about 1,332,383 "humans": 760,616 are male and 154,455 are female. That makes for 57% males, 12% females and 31% unknowns. Many items still need to be identified as human as well.

With a selection like the 12,800 known Harvard alumni, we find 5,359 males and 840 females. That is 42% male, 7% female and 51% unknown. Before we compiled these numbers, missing items were created for each known alumnus, and all of them were marked both as human and as Harvard alumni.
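These percentages are easy to reproduce from the counts above; here is a minimal sketch of the arithmetic (the figures are the snapshot quoted in this post, not live Wikidata data):

    # Minimal sketch: reproduce the sex-ratio percentages quoted above.
    # The counts are the snapshot figures from this post, not live Wikidata data.

    def ratios(total, male, female):
        unknown = total - male - female
        return {label: round(100.0 * count / total, 1)
                for label, count in (("male", male), ("female", female), ("unknown", unknown))}

    print(ratios(1332383, 760616, 154455))  # all "humans": ~57.1% male, ~11.6% female, ~31.3% unknown
    print(ratios(12800, 5359, 840))         # Harvard alumni: ~41.9% male, ~6.6% female, ~51.6% unknown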

The problem Wikidata faces is not only the underrepresentation of women; it is also the lack of data about the gender of known humans. The nice thing about statistics is that, now that we have some numbers, we can track how Wikidata's information about the sexes evolves.
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at April 20, 2014 09:38 PM

April 19, 2014

Gerard Meijssen

#Sources for causes

What to do when #Wikidata tells you there is a problem? Keep calm and cite your sources.

For a year now, people have been pouring data into Wikidata, and there is a lot of it. This data comes from many places, among them all the Wikipedias. They do not necessarily agree on everything.

One area where it particularly makes sense to cooperate is the recently departed. Many of the people who were notable enough for an article are old, and they die. They die in droves.

As some people are described in several languages, you may find that Wikipedians elsewhere knew about a death first, so you may learn about even more deaths in the ranks. Another thing that happens is that people enter different data... OOPS...

This is where you keep calm and cite your sources. People only die once, so this is the time to be assertive about your sources.

Wikidata is at this time happy when you sort it out, get it right and update its data accordingly. Adding sources is really appreciated, but for now we are mostly happy when you concur that we have the same data.
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at April 19, 2014 05:34 PM

Semantic MediaWiki

Semantic MediaWiki 1.9.2 released


April 18, 2014. Semantic MediaWiki 1.9.2, the next minor version after 1.9.1, has now been released. This new version adds new features, e.g. an object ID lookup and making the result query string available to templates, fixes bugs and brings many stability improvements and enhancements. See the Installation page for details on how to install and upgrade.



by Kghbln at April 19, 2014 10:57 AM

Gerard Meijssen

#Wikidata - Heroes of the Soviet Union

On the Russian Wikipedia, there are 10,898 entries in the category for Heroes of the Soviet Union. Only 9,740 of them have a Wikidata item. With the Creator tool it is easy to add the missing 1,158 items. It gently adds them one at a time.

Adding statements for over 10,000 heroes is a bit too much for the AutoList tool. There are several edits to make: first, all the people have to be marked as human, and then they have to receive in Wikidata the recognition for the heroes they are.

It is much better to use a bot for this. What clinches it is that many people on the Russian Wikipedia have a template with much more information than just this one award: things like dates of birth and death, places of birth and death, and other awards they have received.
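For illustration only, here is roughly what such a bot could look like with pywikibot. This is a hypothetical sketch, not the bot that was actually used; the award item ID is left as a placeholder, and the list of hero items would be harvested from the Russian Wikipedia category mentioned above:

    # Hypothetical sketch with pywikibot, not the bot that was actually used.
    # HERO_AWARD is a placeholder item ID; hero_items would be harvested from
    # the Russian Wikipedia category mentioned above.
    import pywikibot

    INSTANCE_OF, HUMAN = "P31", "Q5"   # "instance of" -> "human"
    AWARD_RECEIVED = "P166"            # "award received"
    HERO_AWARD = "Q..."                # placeholder, fill in the real award item

    site = pywikibot.Site("wikidata", "wikidata")
    repo = site.data_repository()

    def ensure_claim(item, prop_id, target_qid):
        """Add the claim prop_id -> target_qid unless the item already has that property."""
        if prop_id in item.claims:
            return
        claim = pywikibot.Claim(repo, prop_id)
        claim.setTarget(pywikibot.ItemPage(repo, target_qid))
        item.addClaim(claim)

    hero_items = []                    # Q-ids of the heroes, e.g. taken from the category
    for qid in hero_items:
        item = pywikibot.ItemPage(repo, qid)
        item.get()                     # load the existing claims
        ensure_claim(item, INSTANCE_OF, HUMAN)
        ensure_claim(item, AWARD_RECEIVED, HERO_AWARD)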

The Russian Wikipedia is a really rich resource and it will be wonderful when more of its information is reflected in Wikidata.
Thanks,
       GerardM

by Gerard Meijssen (noreply@blogger.com) at April 19, 2014 09:28 AM

#Wikidata - Eli Saslow, Pulitzer Prize and George Polk Award winner

There is no #Wikipedia article for Mr Saslow yet, though. Some work was done on the George Polk Award and it was found that, among many others, Mr Saslow was missing.

To demonstrate Wikidata's potential for quality, missing winners were added. Some of the issues found were:

  • people do not have an article
  • people are not part of the George Polk awards recipients category
  • people do not have a Wikidata item
  • some of the recipients are not people
Mr Saslow is a great example of a person who you would expect to have a Wikipedia article. But given the way the community works, he will get one once someone feels the need to write it.

When journalism and sharing information are important to you, consider this: in Wikidata, the Pulitzer Prize for Explanatory Reporting currently has only one recipient... Mr Saslow. His alma mater has three alumni; two more were added so that he would not be alone. His employer, a major quality newspaper, has one employee...

But still, the fact that Wikidata does know these things demonstrates that its quality is improving.
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at April 19, 2014 06:31 AM

April 18, 2014

Wikimedia Tech Blog

Wikimedia engineering report, March 2014

Major news in March includes:

  • an overview of webfonts, and the advantages and challenges of using them on Wikimedia sites;
  • a series of essays written by Google Code-in students who shared their impressions, frustrations and surprises as they discovered the Wikimedia and MediaWiki technical community;
  • Hovercards now available as a Beta feature on all Wikimedia wikis, allowing readers to see a short summary of an article just by hovering over a link;
  • a subtle typography change across Wikimedia sites for better readability, consistency and accessibility;
  • a recap of the upgrade and migration of our bug tracking software.

Note: We’re also providing a shorter, simpler and translatable version of this report that does not assume specialized technical knowledge.

Engineering metrics in March:

  • 160 unique committers contributed patchsets of code to MediaWiki.
  • The total number of unresolved commits went from around 1450 to about 1315.
  • About 25 shell requests were processed.

Personnel

Work with us

Are you looking to work for Wikimedia? We have a lot of hiring coming up, and we really love talking to active community members about these roles.

Announcements

  • Chase Pettet joined the Wikimedia Operations Team as an Operations Engineer (announcement).
  • Following changes in the Engineering Community Team, Quim Gil took over as Engineering Community Manager, and Sumana Harihareswara transitioned to the role of Senior Technical Writer (announcement).
  • Kevin LeDuc joined the Wikimedia Foundation as Analytics Product Manager (announcement).

Technical Operations

Datacenter RFP

Final negotiations and coordination are still ongoing for the data center RFP, but we expect to be able to make an announcement soon.

Wikimedia Labs

Labs metrics in March:

  • Number of projects: 149
  • Number of instances: 310
  • Amount of RAM in use (in MBs): 1,288,704
  • Amount of allocated storage (in GBs): 14,925
  • Number of virtual CPUs in use: 635
  • Number of users: 2,907
The Labs Ops team has spent the month shepherding projects from the Tampa cloud to the Ashburn cloud. Dozens of volunteers contributed to the move, and all tools and projects have now been copied to or rebuilt in Ashburn. Some projects and tools are in a non-running state pending action on the part of their owners or admins. Ashburn Labs is running OpenStack Havana, with NFS for shared storage.
The usage stats this month are quite a bit different from last month. Quite a number of obsolete instances have been purged, and last month’s stats may have included some data center duplication.

Tampa data center

During March, the Ops team decommissioned and shut down a lot of hosts in the old Tampa data center, including all former appservers. The amount of energy consumed in the old data center has been greatly reduced. A few hosts will be migrated to another floor in the existing data center, and further physical data center work is coming up.

Features Engineering

Editor retention: Editing tools

VisualEditor

Presentation slides from the VisualEditor team’s quarterly review meeting on 26 March.

In March, the VisualEditor team continued their work on improving the stability and performance of the system, and added some new features and simplifications, helping users edit and create pages more swiftly and easily. Editing templates is now much simpler, with most of the advanced controls that users don't often need moved into a special version of that dialog. The media dialog was improved and streamlined a little, adding some hinting to the controls to explain a bit more how they work. The cursor entry points that VisualEditor inserts next to items like images or templates, to give users somewhere to put the cursor, now animate on hover and cursor entry to show that they're special. The overall design of dialogs and controls was improved a little to make interactions flow better, like double-clicking a block to open its dialog. A new system for quickly and simply inserting and editing "citations" (references based on templates) neared completion and will be deployed in the coming month. The deployed version of the code was updated four times in the regular releases (1.23-wmf17, 1.23-wmf18, 1.23-wmf19 and 1.23-wmf20).

Parsoid

Presentation slides from the Parsoid team’s quarterly review meeting on March 28

March saw the Parsoid team continuing with a lot of unglamorous bug fixing and tweaking. Media / image handling in particular received a good amount of love, and is now in a much better state than it used to be. In the process, we discovered a lot of edge cases and inconsistent behavior in the PHP parser, and fixed some of those issues there as well.

We wrapped up our mentorship for Be Birchall and Maria Pecana in the Outreach Program for Women. We revamped our round-trip test server interface and fixed some diffing issues in the round-trip test system. Maria wrote a generic logging backend that lets us dynamically map an event stream to any number of logging sinks, a huge step up from the basic console.error-based error logging we have had so far.
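The Parsoid logging backend itself is Node.js code; purely to illustrate the "one event stream, many sinks" idea described above, here is a small, hypothetical sketch of the pattern (not Parsoid's actual implementation):

    # Generic sketch of routing one event stream to several logging sinks.
    # This only illustrates the idea; it is not Parsoid's (Node.js) implementation.
    import re
    import sys

    class EventLogger:
        def __init__(self):
            self.sinks = []                       # (compiled pattern, handler) pairs

        def register(self, pattern, handler):
            """Route events whose type matches `pattern` to `handler`."""
            self.sinks.append((re.compile(pattern), handler))

        def log(self, event_type, message):
            for pattern, handler in self.sinks:
                if pattern.match(event_type):
                    handler(event_type, message)

    logger = EventLogger()
    logger.register(r"error(/.*)?$", lambda t, m: print(f"[{t}] {m}", file=sys.stderr))
    logger.register(r".*", lambda t, m: print(f"[{t}] {m}"))   # catch-all sink

    logger.log("error/parse", "unbalanced template arguments")
    logger.log("info/roundtrip", "round-trip test finished")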

We also designed and implemented an HTML templating library which combines the correctness and security support of a DOM-based solution with the performance of string-based templating. This is implemented as a compiler from KnockoutJS-compatible HTML syntax to a JSON intermediate representation, plus a small and very fast runtime for the JSON representation. The runtime is now also being ported to PHP in order to gauge the performance there as well. It will also be a test bed for further forays into HTML templating for translation messages and eventually wiki content.

Core Features

Flow

This month the Core Features team focused on improvements to how Flow works with key MediaWiki tools and processes. We made changes to the history, watchlist, and recent changes views, adding more context and bringing them more in line with what experienced users expect from these features. We also worked on improvements to the API and links tables integration. On the core discussion side, we released a Flow thank feature, allowing users to thank each other for posts, and began work on a feature to close and summarize discussions. Lastly, we continued work on rewriting the Flow front-end to make it cleaner, faster, and more responsive across a wide number of browsers/devices, which will be ongoing over the next month.

Growth

Growth

In March, the Growth team primarily focused on bug fixing, design enhancements, and refactoring of the GettingStarted and GuidedTour extensions, which were recently launched on 30 Wikipedias. We updated icons and button styles, rewrote the interface copy, and refactored the interface to be more usable in non-English languages. We also began work on a significant refactor of the GuidedTour API, in order to support interactive tours that are non-linear. Non-linear tours will not depend on a page load to run, which will enable better support for tours in VisualEditor, among other things. Last but not least, we made progress on measuring the impact of GettingStarted across all wikis where it is deployed, with results for the first 30 days of editor activity expected in early April.

Support

Wikipedia Education Program

This month, thanks to the work of Facebook Open Academy student JJ Liu, we added a new type of notification for course pages: users are now notified whenever they get added to a course. We also fixed inconsistencies with interface messages, user rights, and the deletion of institutions from the system.

Mobile

Wikimedia Apps

The team worked on the transition from logged-out editing to logged-in editing, and on table of contents refinements.

Mobile web projects

The team worked on the link inspector for VisualEditor on tablets, and a switch between VisualEditor and wikitext on tablets. Both are in alpha.

Wikipedia Zero

During the last month, with the assistance of the Ops and Platform teams, the Wikipedia Zero team added hosting for the forthcoming Partners Portal and continued work on image size reduction for the mobile web. Additionally, the team added Wikipedia Zero detection to the Wikipedia for Firefox OS app and added contributory features support for users on partner networks supporting zero-rated HTTPS connections. The team also removed search thumbnails for zero.wikipedia.org connections in order to avoid spurious charges on devices supporting high-end JavaScript yet using zero.wikipedia.org. Analytics fields were added for the purpose of counting proxy-based and HTTPS-based connections. Routine pre- and post-launch configuration changes were made to support operator zero-rating, and technical assistance was provided to operators and the partner management team to help add zero-rating. The Wikipedia Zero automation testing server was also migrated. The forthcoming Android and iOS apps were also updated to make Wikipedia Zero detection a standard fixture.

Yuri continued analytics work on SMS/USSD pilot data. Post hoc analysis was performed on WML usage after its deprecation; usage is still low, although obtaining more low-end phones to check how well HTML renders, and how to enhance it, could be useful. Post hoc analysis was also performed on anomalous declines and growth spurts in log lines (not strictly related to pageviews); the former had much to do with API changes and the latter with an external polling mechanism.

With the assistance of the Apps team, User-Agent, Send App Feedback and Random features were added to the forthcoming reboots of the Android and iOS apps; the Share feature for Android was changed to allow a different target app each time, and code review assistance was provided on the Android and iOS app code. A proof of concept for full-text search was started on iOS. Wikipedia for Firefox OS bug fixes were also pushed to production. Screencap workflows and preload information were put together for the Android reboot with respect to Wikipedia Zero as well.

The team worked with Ops on forward planning in light of the extremely infrastructure-oriented nature of the program. A quarterly review was held with the ED, the VP of Engineering and the W0 cross-functional team, which also reviewed presentation material for publication. The team also continued work on additional proxy and gateway support. To help partner tech contacts, the team worked on reformatting the tech partner introductory documentation.

Finally, the team explored proactive MCC/MNC-to-IP address drift correction, and will be emailing the community for input soon.

Wikipedia Zero (partnerships)

Smart, the largest mobile operator in the Philippines, is giving access to Wikipedia free of data charges through the end of April. They announced the promotion in a press release. Ingrid Flores, Wikipedia Zero Partner Manager, visited the Philippines and arranged a meeting with local community members and Smart. They are now exploring ways to collaborate in support of education. The partnerships team kicked off account reviews with the 27 existing Wikipedia Zero partners, to update the implementations, identify opportunities for collaboration in corporate social responsibility (CSR) initiatives and get feedback on the program. The account reviews will continue for the next few months. Lastly, we continued recruiting for a Wikipedia Partner Manager for the Asia region.

Language Engineering

Language tools

MediaWiki’s LocalisationUpdate extension was rewritten by Niklas Laxström to modernize its internal architecture to be able to support JSON message file formats. Kartik Mistry released the team’s monthly MediaWiki Language Extension Bundle (MLEB 2014.3) with the latest version of LocalisationUpdate (see release notes). Niklas Laxström also started migrating the Translate extension’s translation memory and translation search back-end from Solr to ElasticSearch in line with Wikimedia’s search migration. David Chan continued his work on input method support for the VisualEditor project.

Milkshake

Santhosh Thottingal, Kartik Mistry and Niklas Laxström made numerous bug fixes and performance improvements in jquery.webfonts, jquery.ime and jquery.uls. Amir Aharoni started collecting metrics on usage of Universal Language Selector.

Language Engineering Communications and Outreach

Runa Bhattacharjee and Kartik Mistry set up a manual testing infrastructure using the Test Case Management System (TCMS) to help get greater participation from the volunteer community in testing the software tools and features developed by the team. Volunteer testing is expected to be kickstarted for language software this coming month. The team's monthly office hour was hosted by Runa Bhattacharjee on March 12. The team also published an overview of webfonts, with the advantages and challenges of using them on Wikimedia sites.

Content translation

Preview of section alignment and basic editing in Content translation

Santhosh Thottingal and David Chan continued development and technology research on the Content Translation project. Development was focused specifically on updates to the side-by-side translation editor and section alignment of translated text. Kartik Mistry and Santhosh Thottingal worked on infrastructure for testing the Content Translation server. David Chan continued his technology research on sentence segmentation.

Pau Giner updated the Content Translation UI design specification incorporating review comments from UX and product reviews. The team also participated in a review of the Content Translation project with the product team leadership.

Platform Engineering

MediaWiki Core

HHVM

The team continued to work on porting C extensions to HHVM. Tim Starling did major work on a compatibility layer allowing Zend extensions to be used by HHVM, and started further work on making the layer compatible with newer HHVM interfaces. The team has made a preliminary deployment of HHVM to the Beta cluster, but this still needs further debugging before it is useful to a wider audience.

Release & QA

The Beta Cluster has been migrated from the Tampa data center to the Ashburn data center. In the move, a ton of cleanup and Puppetization work was done, which will make future Beta Cluster work easier. In addition, the Beta Cluster is getting closer to a place where we can test our current main deployment tool, known as "scap", along with future/other deployment tools. The team continued the rewrite of scap in Python (from Bash scripts + PHP), improving both performance and maintainability, in addition to being in a better position to move to a new tool in the future. We have also started doing SWAT deploy windows twice a day (Monday to Thursday), which has greatly increased momentum for many developers who would otherwise have to wait for the weekly deployment cycle.

Search

In March we upgraded to the newest version of Elasticsearch and expanded onto more wikis. We also started a performance assessment which has started showing us the work required to use Cirrus as the primary search back-end for the larger wikis. We then started in on that work.

Auth systems

The team prepared the migration of the central OAuth database from mediawiki.org to Meta-Wiki, and got input from the Wikimedia Foundation’s legal team regarding the OAuth process.

Wikimania Scholarships app

Support of production application during applicant review period continued in March. A dataset of applicants passing the phase 1 review criteria who had opted-in to sharing application details with chapters and thematic organizations was prepared and delivered to Foundation staff. The beta testing server was migrated from the Tampa data center to the Ashburn data center as a component of the Labs environment migration. The new beta server in Labs is now managed via the MediaWiki-Vagrant role::wikimania_scholarships puppet role and labs-vagrant. This should make keeping development changes and the testing application in sync easier in the future.

Security auditing and response

MediaWiki 1.19.13, 1.22.5, 1.21.8 and 1.19.14 were released for security issues. An internal security training session was held for Wikimedia Foundation staff.

Quality assurance

Quality Assurance

The QA team continues to identify and report issues in a timely way. Of particular interest in March was that an automated test uncovered an issue in the interaction of the MobileFrontend and VisualEditor extensions. This is exactly the kind of cross-cutting concern that our QA systems are designed to uncover. It is likely that we will be in a position to discuss these systems at the Wikimania conference in London.

Beta cluster

There was a substantial effort to migrate the Beta cluster over from the Tampa data center to the Ashburn, VA (“eqiad”) data center. This was led by Antoine Musso with assistance from Bryan Davis and many others.

Continuous integration

Erik Bernhardson and Chad Horohoe managed to create jobs using Jenkins Job Builder (JJB) based on the tutorials on installing JJB and Adding a MediaWiki extension.

Browser testing

Besides a particular focus on MobileFrontend browser tests in March, we have also made some new features available, in particular shared code to upload files properly in all browsers, the ability to check for ResourceLoader problems in any test in any repository, and a basic wrapper for using the MediaWiki API from within browser tests to set up and tear down test data.

Multimedia

Multimedia

Slides for the Multimedia Quarterly Review Meeting for Q3 2013-14.

In March, the multimedia team's main project was Media Viewer v0.2, as we completed final features for the tool's upcoming release next quarter. Gilles Dubuc, Mark Holmquist, Gergő Tisza and Aaron Arcos developed a number of new features, including share, embed, download, an opt-out preference, a file page link and a feedback link, based on designs by Pau Giner. We invite you to test the latest version (see the testing tips) and share your feedback.

Fabrice Florin coached the multimedia team as product manager and hosted several planning and review meetings, including a cycle planning meeting (leading to the next cycle plan) and the Multimedia Quarterly Review Meeting for the first quarter of 2014, which summarizes our progress and next steps for coming work (see slides). He also worked with Keegan Peterzell to engage community members for the gradual release of Media Viewer, to be enabled by default on a number of pilot sites next month, then deployed widely to all wikis a few weeks later. For more updates about our multimedia work, we invite you to join the multimedia mailing list.

Engineering Community Team

Bug management

Besides working on the Project Management Tools Review, Andre Klapper retriaged many older tickets that had had high priority set for more than two years, older PATCH_TO_REVIEW tickets and older critical tickets, and investigated moving the Bugzilla instance on Wikimedia Labs to the Ashburn data center (easier to set up from scratch). Andre added project-specific sections and Bugzilla queries to Annoying little bugs to help newcomers find an area of interest for contributing, and blogged about the 4.4 upgrade (which took place in February) and moving Bugzilla to a new server. In Bugzilla, all remaining Cortado tickets were closed and new versions were set up for the "Wikipedia App" product.

Project management tools review

Guillaume Paumier and Andre Klapper reached out to the teampractices and wikitech-l mailing lists in order to shorten the list of options that can come out of this review process. They also hosted a lively IRC office hour to give an overview of the current situation, answer questions and discuss the first version of the related RFC.

Mentorship programs

The six ongoing FOSS Outreach Program for Women internships were completed successfully, setting a new benchmark for success in our outreach programs. Check the results:

We received 43 Google Summer of Code proposals from 42 candidates, and 18 FOSS Outreach Program for Women proposals from 18 candidates. Dozens of mentors are pushing the selection process that will conclude on April 21 with the announcement of selected participants.

Technical communications

In addition to ongoing communications support for the engineering staff, and contributing to the technical newsletter, Guillaume Paumier edited and published a series of essays on the Wikimedia Tech blog written by Google Code-in students, who shared their impressions, frustrations and surprises as they discovered the Wikimedia and MediaWiki technical community.

Volunteer coordination and outreach

The bulk of work to create community metrics around five Key Progress Indicators is completed, and now we are polishing help strings and usability details. The next step is to share the news with the community and start looking at bottlenecks and actions. Check:

A page about Upstream projects was drafted collaboratively in order to start mapping the key communities where we, Wikimedia, should be active, either as a contributor / stakeholder or to promote our own tools. We helped select the participants sponsored to travel to the Zürich Hackathon 2014 in May.

Architecture and Requests for comment process

We held four RfC review meetings on IRC:

Analytics

Kraken

We reached a milestone in our ability to deploy Java applications at the Foundation this month when we stood up an Archiva build artifact repository. This enables us to consistently deploy Java libraries and applications and will be used in Hadoop and Search initially.

The first Analytics use case for this system will be Camus, LinkedIn's open source application for loading Kafka data into Hadoop. Once this is productized, we'll have the ability to regularly load log data from our servers into Hadoop for processing and analysis.

Wikimetrics

We did some significant architectural work on WikiMetrics this month to prepare it for its role as our recurrent report scheduling and generation system. The first use case for this system will be the Editor Engagement Vital Signs project, which will provide daily updates on key metrics around participation.

Kafka

We continue to investigate network issues between our data centers that are occasionally causing delivery issues. As noted above, we are currently deploying Camus, our software for transferring data between Kafka and Hadoop.

Data Quality

We fixed a number of issues around data quality in Wikistats, Wikipedia Zero and Wikimetrics.

Research and Data

Video of the March session of the Research and Data monthly showcase.

This month we concluded the first stage of work on metrics standardization. We created an overview of the project with a timeline and a list of milestones and deliverables. We also gave an update on metrics standardization during the March session of the Research and Data monthly showcase. The showcase also hosted a presentation by Aaron Halfaker on his research on the impact of quality control mechanisms on the growth of Wikipedia.

We published an extensive report from a session we hosted at CSCW ’14 on Wikipedia research, discussing with academic researchers and students how to work with researchers at the Foundation.

We submitted 8 session proposals for Wikimania ’14, authored or co-authored by members of the research team.

We attended the Analytics team’s Q3 quarterly review during which we presented the work performed by the team in the past quarter and our goals for the upcoming quarter (April-June 2014).

We completed the handover of Fundraising analytics tools and knowledge transfer in preparation for a new full-time research position that we will be opening shortly to support the Fundraising team.

We continued to provide support to teams in our focus areas (Growth and Mobile) with an analysis of the impact of the rollout of the new onboarding workflows across multiple wikis, an analysis of mobile browsing sessions, and ongoing analysis of mobile user acquisition tests. We also supported the Ops team in measuring the impact of the deployment of the ULSFO cluster, which provides caching for the western USA and East Asia.

Kiwix

The Kiwix project is funded and executed by Wikimedia CH.

This month, we released a new version of Kiwix for Android that adds support for older versions of Android like Gingerbread; about 50% more devices than before are now supported.

Wikidata

The Wikidata project is funded and executed by Wikimedia Deutschland.

The team worked on making ranks more useful. From now on, by default, the property parser function and Lua always return the values with the "preferred" rank or, when none is available, the ones with the "normal" rank. This makes it possible, for example, to exclude past mayors when asking Wikidata for the mayor of a city. Additionally, considerable speed improvements have been made; browsing Wikidata is now a lot faster. Diffs between versions of pages on Wikidata have also been improved to make it easier to see what changes were made to an item. Last but not least, the user interface redesign research went on.
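As a rough illustration of what "preferred, else normal" means for a data re-user, here is a small sketch written against the public wbgetclaims API rather than the parser function or Lua bindings themselves; Q64 and P6 are assumed here to be Berlin and "head of government":

    # Sketch of the rank logic described above, using the public wbgetclaims API.
    # This is not the MediaWiki/Wikibase implementation itself.
    import requests

    API = "https://www.wikidata.org/w/api.php"

    def best_ranked_claims(entity_id, property_id):
        """Return the "preferred"-rank claims, or the "normal" ones if none are preferred."""
        params = {"action": "wbgetclaims", "entity": entity_id,
                  "property": property_id, "format": "json"}
        claims = requests.get(API, params=params).json().get("claims", {}).get(property_id, [])
        preferred = [c for c in claims if c["rank"] == "preferred"]
        return preferred or [c for c in claims if c["rank"] == "normal"]

    # e.g. best_ranked_claims("Q64", "P6") keeps only the current head of government
    # of Berlin if that statement carries the "preferred" rank.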

Future

The engineering management team continues to update the Deployments page weekly, providing up-to-date information on the upcoming deployments to Wikimedia sites, as well as the annual goals, listing ongoing and future Wikimedia engineering efforts.

This article was written collaboratively by Wikimedia engineers and managers. See revision history and associated status pages. A wiki version is also available.

by Guillaume Paumier at April 18, 2014 12:37 PM

Gerard Meijssen

#Wikidata - awards and politics

Edward Snowden received the Ridenhour Truth-Telling Prize. For some, Mr Snowden and the Ridenhour prize may be controversial. That, however, does not mean that they are irrelevant or not notable. The award has been added to Wikidata together with many of its recipients.

Several of the recipients do not have a Wikipedia article and that is fine... This may change. The Ridenhour prize was named after a Mr Ridenhour. He was a journalist and he received the George Polk Award. This was not obvious, because the article made no reference to the category. This has been remedied.

The world is an imperfect place and we can improve it by cherishing the people who matter. By stating the obvious, by sharing in the sum of all knowledge.
Thanks,
      GerardM

PS: This is the George Polk Award Recipients category and this is its Reasonator entry.

by Gerard Meijssen (noreply@blogger.com) at April 18, 2014 10:58 AM

#Reasonator - #Taxonomy, picture this

When you are on a train, a bit bored, it helps when your railway company provides you with free Wi-Fi; mine does. It is the perfect setting to add pictures of species to Wikidata. To do this, the latest tool by Magnus is really good.

The "Wikidata species images on Commons" is the perfect companion for those idle moments. All you do is look at pictures and decide what pictures shows off a species best. When you do not like any of them, that is fine too. You also have the option to add range maps for a species.

The result of all this can be experienced in the Reasonator.
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at April 18, 2014 06:51 AM

April 17, 2014

Gerard Meijssen

Baglama - #statistics that make a difference, that demonstrate impact

Arguably, the #GLAM partners of the #Wikimedia Foundation are its most relevant partners. They share a wealth of imagery and data through us. Among other things, they help us illustrate our articles.

What they provide us with is high in number and high in quality. It is dropped into Commons, and it is then up to the Wikimedia communities to make use of this wealth.

Baglama, one of Magnus's finest tools, has had a refresh, and it again shows what impact any given collection has: the page views it generates. This information is vital; how else can you prove that providing us with freely licensed media files makes a difference and gains a public? What better way to convince people that we make a difference?

The Tropenmuseum for instance has much of its impact in Indonesia. Its collection is used more on the Indonesian than on the Dutch Wikipedia. 

What the Tropenmuseum collection proves is that Western museums can have great international relevance. When they collaborate with us, they have an impact in the countries where their collections originated. This message is as relevant to the Wikimedia communities as it is to the GLAM institutions. We could and should care more about those collections that are not "local", collections that are typically underfunded.
Thanks,
          GerardM

by Gerard Meijssen (noreply@blogger.com) at April 17, 2014 10:54 AM

#Wikimedia #Commons - Mr Tadeusz Łobos

The picture that illustrates the article about Mr Łobos is licensed under the CC BY-SA license. Consequently, it is a picture that should be available for use in Wikidata. It is not, because it was not uploaded to Commons.

For a picture like this one to be made generally available, it has to be uploaded again to Commons. That is a hassle, and another round of bureaucracy will decide whether this picture may remain on Commons.

There are many pictures that should be generally available and are not. Similarly, there are many pictures that are considered not to be generally available and, as a consequence, are no longer available at all.

All media files are equal in that they need the same quality of metadata, and they are equal in that their copyright and license situation is the same no matter where they have been uploaded.

The consequence is that the scope of a Wikidata approach to media files should not restrict itself to Commons.
Thanks,
       GerardM

by Gerard Meijssen (noreply@blogger.com) at April 17, 2014 09:09 AM

Wikimedia Foundation

Luis Villa: “I wanted to be an Internet lawyer”

In legal circles, the Wikimedia Foundation is often seen as a curiosity. With a fraction of the staff of other top ten websites, the Foundation arguably does more with less. The core of this complex apparatus consists of two indispensable parts: a strong volunteer community and an equally dedicated legal staff.

Luis Villa

As deputy general counsel, Luis Villa is at the forefront of this eclectic mix that combines traditional legal counsel with community advocacy that stretches across 700+ communities. With a year under his belt at the Wikimedia Foundation, he feels that he’s doing what he always wanted to do. “Out of law school I told someone at my summer job that I wanted to be an Internet lawyer,” says Villa. “He basically said there’s no such thing, but now I have that job!”

Luis' interest in law and technology goes as far back as high school; he recalls the United States vs. Microsoft court proceedings as a moment that ignited a curiosity in him for politics and technology. Embracing his passions, he pursued a degree in Political Science and Computer Science at Duke University. "When I started studying computer science and political science in 1996, those were two separate things," Villa explains. "I was interested in political philosophy and I was interested in computers and I didn't really think the two had much overlap." It wasn't until he read Lawrence Lessig's "Code and Other Laws of Cyberspace" that he realized how much overlap there was between the two.

His first job was in quality assurance for Ximian, scoping out bugs and figuring out why things were crashing. While at Ximian he worked extensively on the GNOME open source project doing quality assurance, eventually becoming a board member. He went on to work at the Berkman Center for Internet and Society at Harvard as a "geek in residence". After a comprehensive search into a variety of institutions with a strong intellectual property law faculty, he enrolled at Columbia Law School, graduating in 2009. Before working at a law firm, he spent a year at Mozilla, leading the project to revise the Mozilla Public License. Luis later joined Greenberg Traurig, participating heavily in the Oracle v. Google lawsuit. While at Greenberg he became an outside counsel for the Wikimedia Foundation. With a background well tailored to the Foundation's goals and needs, Luis eventually made the decision to join the Foundation full-time as deputy general counsel.

Luis quickly found himself doing much of what he was already used to, but with the twist he was looking for. The Foundation's legal department largely focuses on taking the right amount of time to consult with the widest possible community of editors and users, as well as developing plain-language policies that equally benefit and support the Foundation and volunteers. "For the most part the work I was doing in past companies was similar to what I do at the Wikimedia Foundation, but what stands out the most is the diversity of problems we see at WMF," says Villa. "Open source in the Mozilla community is easily quantifiable. You can talk about the Mozilla community, or the GNOME community easily, but the Wikimedia community is easily 700 to 800 communities. That took a while for me to get my head around."

This constant interaction with the community is truly the core of the Foundation. Policies are built in an open space where anyone can share ideas. The idea is to develop policies that protect the Foundation and protect editors and readers. Arguably, the Foundation does more than pretty much any other major web property to protect user privacy.

Despite the staff's strong background and undeniable dedication, Luis makes it clear that it is the fundamental principles of the community that truly keep the Foundation running. "Every member of this team has a strong background and they are quite simply really sharp lawyers, but that's just step one. Step zero is the community: the community is so dedicated to getting this stuff right that we only get the most complex and hardest questions."

The community acts like a sort of filter, explains Luis, essentially picking and choosing the problems that merit further attention. "I don't think there's anyone in the community who thinks of themselves as a person who fights libel, but by simply saying 'citation needed,' that person is essentially helping our legal team by making biographies of living persons fact based, for example."

The Foundation’s international community provides a broad and interesting spectrum of responsibilities. When asked what shocked him the most when he first started working at WMF, Luis answered, “The breadth of issues that we face as a legal team. We handle cutting edge free speech issues, copyright issues, complex privacy issues, product counsel for the technology teams, it’s really all over the place.”

The reality is that most outlets don't perceive lawyers as practicing the same values as the Foundation. The view is inverted: they see lawyers as closed-off professionals who do something mystical. "At WMF you get to be a real internet lawyer dealing with hard legal problems, and you also get to sleep at night," explains Luis. This seems to be at the core of what motivates him and others like him. As an example, he mentions last year's case involving a French editor and the French government, where Wikimedia's legal team helped defend the editor, as a situation where "we stood up for the right values" and protected both the volunteer and the Foundation.

The future of the Foundation is bright, says Luis. "The Foundation is building the capacity to do more interesting things. For a while it was more about how to keep the site running, and now it's more about how to be proactive." The Foundation continues to supply the community with the right tools to flourish in a safe way. Whether it's privacy and trademark policy or First Amendment rights, Luis finds himself exactly where he wants to be. "This is the first time in many years where I am not at all worried about what I'm going to do next; I'm so in the moment."

Carlos Monterrey, Communications Associate for the Wikimedia Foundation

by Carlos Monterrey at April 17, 2014 01:21 AM

April 16, 2014

Gerard Meijssen

#Reasonator - Greene County, New York

Greene County is just another county in just another US state. In it you find towns, villages and "census designated places". The first line of contact for all kinds of administrative issues is the county.

For a long time, it was the state that was considered the right value for "is in the administrative-territorial entity"; in this case that is New York. This has been remedied, and at the same time many other villages, towns and whatnot have been associated with the lowest level of administrative-territorial entity they have to deal with.

When you reason like the Reasonator, you can go up the food chain and find the state and the country that a place like Climax has to deal with. It is just one way the Reasonator turns Wikidata data into information for you.
Thanks,
       GerardM

by Gerard Meijssen (noreply@blogger.com) at April 16, 2014 05:09 PM

Gerard Meijssen

#Wikidata - an old soldier from Russia

My interest in Mr Styrov is that he died in 2014, on January 26 to be precise. In the infobox of the article dedicated to him, this is just one of several facts. Among them are the rank of Rear Admiral he held in the Soviet Navy and the many awards he received, among them the Order of Lenin.

With Mr Styrov added as someone who died in 2014, we are closer to the point where Wikidata knows all the people. When we are, we can make a report that will signal the latest people who are known to have died in the last year.

This can be of interest to people who want to know things like:
  • Do we know that this person who has an article on our Wikipedia has died
  • Fellow countrymen who died and do not have an article on our Wikipedia
  • Which people are notable enough to have an article on our Wikipedia
When such reports become available, the data in Wikidata gains a purpose. Some may see it as getting Wikipedia to use the data by the back door, but isn't it there to be used?
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at April 16, 2014 11:41 AM

Priyanka Nag

Blogging is simple...

In the last few days, I have had several requests from different people asking me to suggest how they should start blogging and to give them some tips on blogging.

On one hand, it makes me feel proud to think that my blogs are encouraging others to become bloggers, but at the same time, it increases my responsibility towards my readers.
These questions actually did force me to put on my thinking cap and ponder over the real secrets of being a good blogger. Is blogging really an art?

Photo source: http://www.gabrielweinberg.com/blog/2011/08/why-i-blog.html

For me, blogging is as simple as writing your journal. You just need to put your thoughts into words.

If I had to jot down a few important tips on blogging, these would probably be the ones:
  • Know your readers. I am sure that, while making a blog post, most of us have a target audience in mind. Understanding the reader's perspective is important. Just as a speaker needs to strike the right chord with his or her audience, a writer needs to do the same.
  • Keep your blog simple. There is not much need to put in too many flowery or highly technical words (unless there is an absolute need or requirement for it). A simple blog is easier to read and understand.
  • Make your blog interesting to read. Unless you are writing a completely technical blog, there is no harm in putting in a few light jokes here and there. [Just a word of caution here, let the jokes not be at the cost of anyone's sentiments.]
  • If you have a very techy blog post, you can always add a few screencasts or screenshots here and there.
    If you are writing a blog post about your travel experiences, adding a few pictures is always fun and interesting.
I do not have much more experience with blogging than this... so someone seeking an answer to this question can search the internet for more available material.

Also, it would be great if some of my readers can leave their views on this topic as a comment on this post.

by priyanka nag (noreply@blogger.com) at April 16, 2014 10:15 AM

Gerard Meijssen

#WMhack - last year's demo

At last year's hackathon, one highlight was the map showing the history of Islamic states. It makes clever use of Wikipedia and it makes information available in Arabic and English.

This year the hackathon will be in Zurich, and one of the main subjects is maps and how they relate to Wikidata. The history of Islamic states is relevant in all languages, and it would be cool if we could update this application to make use of Wikidata.

In this map, all the different states have an overlay; there must be a map for each state at each interval. The app shows the different rulers at any given time.

When we have such overlays available to us, we can do more than show the rulers, the centres of power. We could show the battles, the wars, the conquests. They happen in between the changes of the maps.

Maps that show areas growing and waning over time are another area where Wikidata can be a real help, if only because it links to information in so many languages.
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at April 16, 2014 09:35 AM

#Wikidata, #maps and #sprites

Maps are static, but many maps describe something that is very much alive. Take the "raid on the Medway": it shows two flotillas and indicates how they move. It does not show anything happening on the other side of this battle. It does not show positions.

A sprite is a two-dimensional image or animation that is integrated into a larger scene. It can be put on a map. Picture this: the same map of the raid on the Medway, and two little sprites moving in time along the indicated lines.

Technically it is not much of a challenge. It becomes interesting when it starts moving on a real map, an OpenStreetMap for instance. Add the moments where we have images that show scenes of a battle or an occurrence, and we get something that becomes relevant.

Technically it seems doable. It seems like a challenge that brings together parts that already exist. It will be really exciting when it brings Wikidata, Commons and OpenStreetMap together. It will help us explain events. It is part of sharing the sum of all knowledge.
Thanks,
       GerardM

by Gerard Meijssen (noreply@blogger.com) at April 16, 2014 08:02 AM

April 15, 2014

Wikimedia Foundation

Katherine Maher joins the Wikimedia Foundation as Chief Communications Officer

Katherine Maher

We’re happy to announce that Katherine Maher has joined the Wikimedia Foundation as Chief Communications Officer. She officially stepped into her new role as head of WMF communications on April 14, reporting to the Executive Director.

In her role as CCO, Katherine will work to ensure fast, easy information flow about Wikimedia in multiple languages, both internally within the movement and outside of it. She’ll also work to provide vital communications support to WMF’s various departments and programs, as well as the broader Wikimedia movement.

Katherine comes to us from Washington D.C., where she was most recently Advocacy Director for Access, a global digital rights organization. At Access, she was responsible for media and communications, including communications between the organization and its 350,000 members. She handled urgent global threats to digital rights and participated in the organization’s strategic planning. In addition, she was deeply involved with the production of RightsCon—a conference series convening key stakeholders and influential voices on the issue of preserving a free and open internet that supports digital rights and free expression.

Katherine’s experiences advocating for the rights of ordinary internet users and engaging with a large global community make her an exceptional fit for this new role. We are thrilled to have her aboard.

Sue Gardner, Executive Director of the Wikimedia Foundation

by Sue Gardner at April 15, 2014 07:22 PM

Wikimedia UK

A report from the EduWiki Conference in Serbia


The photo shows a group of around 15 people gathered together in an office space

Selection of the participants at the Eduwiki Serbia learning day event

In a post entitled Preparing for the Wikimedia Serbia EduWiki Conference, published on this blog on 20 February, Brian Kelly described how he would attend the EduWiki Serbia conference and learning day and report on educational developments taking place in the UK. This post provides his reflections on the events.


Background

The EduWiki Belgrade conference was organised by Wikimedia Serbia and held at the Belgrade Youth Centre on Monday 24 March 2014. The conference provided an opportunity to share experiences of the educational use of Wikipedia in Serbia, complemented by summaries of similar activities in the US, the UK, Germany, the Czech Republic and Ukraine. Prior to the conference, a learning day event was held in the Wikimedia Serbia offices.

The Learning Day

The Learning Day event provided an opportunity for Wikimedia Serbia staff to outline education activities taking place in Serbia and receive feedback from those working or involved with other national Wikimedia chapters (the Wikimedia Foundation and chapters in Germany, the Czech Republic, the Ukraine, Macedonia and the UK).

The learning day was structured so that feedback was provided on a number of areas, which helped to focus attention and to ensure that the day was valuable for all participants. The topics covered were project metrics; leadership; target groups; quantity and quality of articles; attracting new editors; feedback on the educational projects; and opportunities for cooperation across Wikimedia chapters.

As can be seen from the accompanying photograph of a slide which summarised plans for the future, the Wikimedia Serbia organisation is ambitious, with the intention that “in 3-5 years Wikipedia [will be] a part of the Serbian educational system“.

The Eduwiki Conference

The EduWiki conference provided a series of presentations about Wikipedia and related activities. Following the welcome to the conference from Rod Dunican of the Wikimedia Foundation and members of Wikimedia Serbia, the morning session provided an overview of Wikipedia, details of the education programme and examples of the educational projects taking place in schools and colleges. The morning session also included presentations on Creative Commons and open access. The afternoon session provided details of activities taking place beyond Serbia. Following an overview of the Wikimedia Education Programme given by Rod Dunican, Director of Global Education Programs at the Wikimedia Foundation, details of national activities were provided by speakers from Germany, the Czech Republic and Ukraine, and by myself, summarising activities in the UK.

I had previously written a blog post on Open Education and Wikipedia: Developments in the UK which went into some detail of some of the key activities I would describe in my presentation: highlights from the EduWiki UK 2013 conference, the Jisc Wikimedia Ambassador post and the forthcoming Wikimania conference, to be held in London in August 2014. However after I submitted my slides I discovered that I would only have 15 minutes for my presentation, rather than the 45 minutes which the layout of the conference timetable suggested! I was able to provide an edited summary of my slides (which are available on Wikimedia Commons) although the original slides are still available and are hosted on Slideshare.

Reflections

The EduWiki Serbia conference attracted only a small number of participants, many of whom were speakers at the event. It would seem that the value of Wikipedia in education is not yet appreciated beyond the early adopters. It seems to me, therefore, that there is a need to explore outreach strategies which go beyond the early adopters and appeal to the early mainstream community, who may be willing to make use of Wikipedia if they see benefits for their mainstream activities.

Such approaches may require the use of communications and outreach channels which go beyond the mailing lists, blogs and wiki resources managed by Wikimedia chapters. I found it interesting to observe how Wikipedia Serbia has a Facebook page and makes use of it for its outreach activities, with 678 current 'likes' of the page. Might monitoring metrics of social media use by Wikimedia chapters provide useful insights into potentially valuable outreach channels, I wonder?

Further Information

A large number (currently over 130) of photographs about the EduWiki Conference Belgrade 2014 have been uploaded to Wikimedia Commons with an additional 110 photographs about the Learning Day also available.

I also created a Storify summary of the two events. I decided to do this after making use of Storify to provide a report on the WIKIsymposium which took place at the University of Stirling a few days before the events in Belgrade. As I described in a blog post on Emerging Best Practices for Using Storify for Archiving Event Tweets, Twitter has the potential to enable discussions and ideas shared at events to be made available to a wider community, and if there are enough people tweeting at an event, a useful summary of the event can be produced. Perhaps this might be a useful approach for raising the visibility of Wikipedia events within the Twitter community? I'd welcome your thoughts.

 

by Stevie Benton at April 15, 2014 11:20 AM

Wikimedia Tech Blog

Agile and Trello: The planning cycle

This blog series will focus on how the Wikipedia App Team uses Trello for their day-to-day, week-to-week, and sprint-to-sprint development cycle. In its early days, the Wikipedia App was an experimental project that needed flexibility for its evolution. The team looked at a number of tools like Bugzilla, Mingle, and Trello to wrangle our ever-growing to-do list. We found that most imposed a structure that was stifling rather than empowering, cumbersome rather than fun, and generally overkill for what we needed.

Trello looked attractive as it took no more than a couple of minutes to see its moving parts, was available on multiple platforms, and was simple to customize. We experimented with it and quickly found that we could make it do most of what we wanted.

For those unfamiliar with Trello, at its most basic level it's a list of lists, and it functions incredibly well within an Agile framework. Trello uses the concepts of boards, lists, items, and subitems. Boards contain lists, which contain items, which in turn contain subitems.

Here is how we use it:

Each idea starts out as a narrative or user story on our backlog board. Most of our stories are written in an "As a …, I want to …, so that …" format. This allows us to have a narrative justification for a unit of work rather than a list of technical requirements. Stories begin their life in the "In analysis" column, where the product manager (who acts as the product owner) vets the idea with other stakeholders, involves the Design team, and generally incubates the story. Anyone is welcome to add a story to this column.

When the product owner feels that a story has matured enough, they place it in the “ready for prioritization” column with any required design assets. As these stories increase in number, we begin to see the next sprint forming.

Within a couple of days, the team meets and the product manager discusses the theme of the upcoming sprint. A new sprint board is created and the product manager moves the most important 3–5 stories onto it for deeper analysis by the whole team. The team meets and collectively refines the story cards to have a clear set of acceptance criteria under the checklist column, flags stories that need additional design, and prioritizes them in top-down order.

Within a week’s time, the team meets again, but this time their goal is to estimate and do a final pass on each story card. We use a combination of Scrum for Trello and hat.jit.su to facilitate the estimation process. Once all stories have been estimated, the product manager re-prioritizes, checks against our sprint velocity, and the sprint is ready to start.

Thus at any point we have three active boards:

  • Backlog – where all stories start
  • Current Sprint – what developers are working on
  • Next Sprint – what’s coming up next

Next time we’ll see what happens from the developers’ standpoint during a sprint.

Tomasz Finc, Director of Mobile

by Tomasz Finc at April 15, 2014 08:29 AM

Gerard Meijssen

#Wikidata - The dead at the #Arabic #Wikipedia

As the #quality of Wikidata can be measured, it is important to be inclusive when what is measured is the people who died in 2014. People die in every country, like this gentleman from Saudi Arabia.

According to the article, he died on Thursday, 12 May 1435 AH, which is the same as 13 March 2014. Google Translate transliterates the title as "Zaid bin Mohammed portal". That is reason enough not to use the transliteration as the title for the item.

When you read the article, it is about Mr bin Mohammed and does not give the impression of a portal page. That is for someone who understands the Arabic Wikipedia to fix.
Thanks,
     GerardM

by Gerard Meijssen (noreply@blogger.com) at April 15, 2014 05:14 AM

Wikimedia Foundation

In memoriam of Cynthia Ashley-Nelson

Cynthia Ashley-Nelson

Cynthia Ashley-Nelson passed away on Friday, April 11th. She was attending the Wikimedia Conference in Berlin as an AffCom member, and on Thursday had participated in her first annual AffCom meeting. The news about her death has surprised and shocked the people at the conference. I realize there are many people who might not be familiar with her, so I wanted to write a few words about the impact she made on those who knew her.

In my role as Board liaison to the Affiliations Committee, I had seen Cindy, as her friends called her, apply to become a member – and ultimately be elected to the committee. She had such a solid background, so relevant to the work AffCom does, and she was such a strong candidate that it was a no-brainer for AffCom to elect her. They were not disappointed. Cindy was participative, incredibly engaged from day one, always looking ahead and trying to improve existing processes and expand AffCom’s role. She had wonderful ideas and a refreshing perspective regarding movement roles and the role of AffCom. One that I especially liked was her desire to implement a thorough Affiliate Development Program, to help guide new affiliates and teach them relevant skills so they could not only be better equipped to survive, but to thrive and have a bigger impact in a shorter period of time.

I got to know Cindy a bit beyond that, for she wanted to test ideas and potential directions in which to take the movement. We would send each other long emails about movement roles and how to move forward with the movement. And as usually happens, conversations turned from the more formal to the informal, eventually including little snippets of our everyday lives, the good things that happened to us and the not so good. When we met for the first time face to face several days ago, we gave each other a big hug. In the session we had during the AffCom meeting she once again showed her passion and commitment to help re-imagine the role of AffCom and how to help new affiliates. At the end of that session, she was confirmed as the new vice-chair of AffCom. That speaks to the impact she made on the committee in such a short time. I think our last interaction was about getting together at some moment during the conference to just hang out and talk. She had a great smile.

As far as we know, Cindy died peacefully and in her sleep. When the tragic news came in on Friday night during dinner, so out of the blue, I was shocked. Literally shocked. She had missed the meeting between AffCom and the Board, which was very surprising, and it hadn’t been possible to contact her, but it didn’t necessarily make one think something bad had occurred. When the Board was notified of what had happened, we wanted to be very respectful of the fact that the priority had to be to contact the next of kin before any kind of public announcement was released. But AffCom had to be told. I had been an AffCom member before joining the Board. Breaking the news to them was one of the hardest things I’ve ever done. We went to a room to deal with the shock and the reactions. Nobody wanted to be alone.

This morning after the next of kin had been located and notified, we all got together for breakfast and went together to the venue where a grief counsellor was available. There was a brief but touching tribute at the beginning of the conference. AffCom then prepared a public statement about Cindy’s death. I felt my place was with them, helping them word it. As the schedule was reorganized, I missed the Meet the Board session which was moved to the morning, which I deeply regret, but I did want to be with AffCom in these moments. I want people to know I will be available for anyone who wants to ask me anything about the Board or the movement at the venue. I just couldn’t make it that morning. Before ending this post, I would like to take a moment to thank the people of WMDE, who were incredible in such difficult circumstances and who set up a special room to grieve for her and write in a book of condolences, particularly Pavel, and WMF staff, especially Anasuya, Garfield and Asaf. The support of Board members was deeply appreciated as well, not only by me but by AffCom as well.

This post is perhaps a bit cathartic for me. Cindy, you made an impact in those who knew you and you will be remembered. My thoughts are with the family and friends. Rest in peace.

María Sefidari, WMF Board of Trustees member

  • See Cynthia’s user page on English Wikipedia.
  • Wikimedians have begun to share their memories and condolences about Cynthia on her user talk page.
  • Memorial post by Asaf Bartov, Head of WMF Grants and Global South Partnerships.
  • Announcement by Carlos Colina and Bence Damokos from the Wikimedia Affiliations Committee
  • Wikinews story on the passing of Cynthia Ashley-Nelson.

by Maria Sefidari at April 15, 2014 01:31 AM

April 14, 2014

Pete Forsyth, Wiki Strategies

Five things a Wikipedian in Residence can do


Are you a Wikipedian? Do you want to help a museum, a library, a university, or other organization explore ways to engage with Wikipedia? Great – you should offer your expertise as a Wikipedian in Residence!

If you find yourself in such a role, you will have opportunities to help your host organization contribute to the sharing of knowledge in new and exciting ways; and to help Wikipedia readers and editors around the world benefit directly from the expertise and institutional knowledge your host possesses. Ideally, your role is that of a connector and a facilitator; you should aim to empower those around you (both the staff of your host organization, and Wikipedia volunteers who share the organization’s interests).

So what can you do to get off to a good start? Below are a few ideas, drawn from past Wikipedian in Residence programs. (It may also be helpful to review the assessment of a program that many felt was not planned effectively: Assessment of Belfer Center Wikipedian in Residence program)

1. Chat it up on Wikipedia!


A great Wikipedian in Residence will convene discussion among Wikipedians and the host organization’s people, both online and in person. ©Lettera 27, CC BY-SA 3.0.

Wikipedia’s talk pages can be drama machines – or they can be ghostly silent. But when all is going well, they can be incredible forums for processing complex information, and determining the best way to clearly and neutrally guide a reader’s learning process.

What makes discussions on Wikipedia work well? It can help to have an expert or two around, but even more vital is relaxed, friendly, and informal facilitation. One or more people who have a clear commitment to improving the article and developing consensus can make a tremendous difference, just by keeping the conversation going and drawing attention back to the important questions. What better role for a Wikipedian in Residence?

You should begin your Residency with a commitment to working openly, in whatever way best fits your project. Explain your substantial edits and additions on the article’s talk page; or better yet, at a relevant WikiProject. If somebody’s work catches your interest, let them know on their user talk page. Get to know the volunteers who are already passionate about your topic, and help them work together more effectively.

To establish context for anybody who might be interested in your work, be sure to create a “project page” on Wikipedia covering the goals and activities of your residency. See this example: Project page from the Children’s Museum of Indianapolis residency

2. Talk to your boss about copyright. Early and often.


Can’t find the right words? Frank talks on important topics are never easy.

When I say “let’s talk copyright,” I bet you have one of two reactions: OK, where do you want me to start? Let’s set aside a few hours! Or you might say: Wait, I thought Wikipedia was supposed to be fun!

Enjoyable or not, copyright and licensing are hugely important to your host organization, and to their ability to contribute meaningfully to the Wikimedia vision. You probably don’t want to bore all your colleagues with all the details. But you should seek out decision-makers, and make sure they have a good grasp of how free licenses work, and how various kinds of works enter the public domain.

The way information flows through the world is changing really fast. When an organization invites a Wikipedian in Residence to join them, it is usually seeking – among other things – to update its practices to better adapt to Internet culture, and to move into the future with confidence. And one of the central considerations, when updating information practices, is copyright. Help your host organization develop practices that both let it meet its goals and earn the respect of those Wikipedians who care about responsible management of copyright.

3. It’s your party. Make some introductions!


Photo by George Patton, public domain.

Some Wikipedian in Residence programs last a few weeks; a few last for years. But they always end! An organization usually hopes a Wikipedian in Residence will build lasting and sustainable ties to Wikipedia – and there may be similar expectations from the Wikipedia side as well. So one of the best things you can do is to help your organization – its curators, librarians, or staff – meet other Wikipedians, and learn how to interact in their strange environment.

You can do this through physical events, online, or better yet, both! In-person events like a Wikipedia workshop, an edit-a-thon, or a backstage pass will usually be more familiar to your organization, and can also help you build visibility around your work. Online engagement might mean using something like Skype or Google Hangouts, or it might mean guiding your host organization in navigating a WikiProject’s talk pages or an email list. Ideally, you should use all of these tools, and actively reach out to Wikipedians (both locally and internationally) to join you.

4. Whoa there! Don’t do it all yourself!


Dan & Susie Romer, CC BY

If you’ve been an active contributor to Wikipedia, you’ve probably had this experience: you leave a note on a talk page, hoping somebody will weigh in on an idea; and six months later, there’s no response. So you might have developed a habit of boldly adding material to Wikipedia, relying on your understanding of what is appropriate, without worrying too much about how any one addition is received.

If so, as a Wikipedian in Residence, you should reflect on that approach. If you’re adding a basic fact to an article, maybe that creates an opportunity to show a colleague how to format a reference. Or, maybe you’ve spent the last three months trying to persuade your host to release a collection of photos under a free license – and have finally found success. Congratulations! But before you stay up all night uploading them yourself, consider the benefits of showing a few colleagues how to properly use the Wikimedia Commons upload wizard.

5. Be extra clear about your role


By Aileen Denton

If you’re doing the kind of stuff discussed above, this part will come naturally: you will be clearly expressing who you work for, and how you’re approaching your work, as you add material to Wikipedia. But regardless, you should give some thought to it. Make sure that readers and editors who care about your topic, and would want to know about your involvement, have a reasonable chance of learning about it.

At minimum, your user page should clearly explain your Residency, and how you are approaching it. If you’re working actively on specific articles, you should also leave notes on their talk pages, and/or the talk pages of relevant WikiProjects. If you’re unsure, ask another Wikipedian for their take – an independent perspective can help a lot. (And hopefully, the Wiki Strategies Statement of Ethics can serve as a useful guide.)

As you think about this, remember that others will be following your lead. Don’t just meet the bare minimum – set a high standard that will give your host a great example to follow in the future. And when you do one-on-one consultations, workshops, presentations, be sure to cover this topic, and help your colleagues create user pages and the like.

In conclusion…


You have an opportunity to bridge gaps, using multiple forms of communication. Have fun with it! ©Hildabast, CC BY-SA 3.0.

Whether your residency is three weeks or three years, your last day will arrive before you know it! As it approaches, you will probably start to realize that you are the most informed person on the planet about the intersection between your host organization and Wikipedia. And that’s no small feat.

You should make sure your knowledge lives beyond your residency – for the benefit of both your host and other Wikipedians. Did you learn anything useful from the kind of activities discussed above? Great!

Consider capturing those lessons in a “how to engage with Wikipedia” document for your host organization. Your colleagues will want to refer to it when their memories start to fade: Wait, how do I make a wikilink? What are the different licensing choices, again?

And also, tell Wikipedians how it went, and what opportunities are still in play with your host! Write a blog post (or three!). Send an email to the cultural partners email list. Give a talk at a conference like Wiki Conference USA. Tell us what worked, and what didn’t – we’re all eager to learn from your experience!

by Pete Forsyth at April 14, 2014 05:33 PM

Wikimedia UK

Thoughts on the Wikimedia Conference


A mural storyboard from the Wikimedia Conference

This piece was written by Stevie Benton, Wikimedia UK Head of External Relations, and is one of a series of reflections on the Wikimedia Conference 2014 in Berlin

As I write it’s the final day of the Wikimedia Conference in Berlin. It’s been a very busy but incredibly worthwhile few days. It is my first time attending a Wikimedia Conference and having also never attended Wikimania I wasn’t at all sure what to expect.

The reality of the conference is that it’s hard work. From the outside looking in this may not be obvious but I can promise you this is the case.

The conference featured a very full programme of presentations, workshops and discussions alongside plenty of opportunities to meet with people from across the chapters and the Wikimedia Foundation. I was fortunate enough to be personally involved in the delivery of one of the sessions, a panel about advocacy. This proved to be a very helpful session and there was a strong consensus that achieving favourable reform to copyright should remain a focus of movement advocacy.

It was extremely useful to meet so many people that I have worked with over the last couple of years but have only encountered online. I was very encouraged by the diversity of the conference and its very international nature. There are so many intelligent and motivated people, both volunteers and staff, working to share the sum of all human knowledge and I was inspired by them all.

Our movement is in great shape. The progress made by chapters and the Wikimedia Foundation would be difficult to overstate. Wikimedia UK is no exception to this. There is admiration for the progress our chapter has made in terms of governance, strategy and measuring our impact and the lessons that we have learned are being widely shared across the movement.

The strongest message I have taken away from the conference is that the future looks very bright indeed, albeit with much work to be done. I’d like to say a huge thank you to the volunteers and staff that made this conference such a success – they did a remarkable job of keeping things organised, helping people get to where they needed to be and welcoming so many people to their office. Without their efforts the conference wouldn’t have been such a productive, useful and enjoyable experience.

by Stevie Benton at April 14, 2014 11:30 AM

Conference scholarships


Group photo of participants of WikiSym+OpenSym 2013 in Hong Kong

As part of Wikimedia UK’s continued efforts to support the Wikimedia community in the UK, we regularly offer scholarships to enable attendance at international conferences and meetings. Past scholarships have enabled members to attend previous years’ Wikimania, WikiSym and Wikimedia Hackathon events, such as the 2013 Hackathon in Amsterdam. This year, as a result of our support, Wikimedians in the UK have been able to attend the European Parliament in Strasbourg to take photos and videos of European Parliament members, and attend the EduWiki conference in Belgrade, Serbia to share Wikimedia UK’s experiences from our education-related outreach activities.

Two further scholarship opportunities are now available, the first for the Open Knowledge Festival and the second for OpenSym. OKFestival, run by the Open Knowledge Foundation, is an open data and open knowledge conference that will bring together over 1,000 people from more than 60 countries in a bid to encourage innovation in the open sector through sharing experiences and skills. Furthermore, the event is a celebration of the open movement itself and what it has already achieved. OpenSym, previously known as WikiSym, is the International Symposium on Wikis and Open Collaboration, where researchers from all over the world gather to present their latest research and practice on “open access, open data, open education resources, IT-driven open innovation, open source, wikis and related social media, and Wikipedia”. Both of these conferences are being held in Berlin, Germany, with OKFestival on 15th-17th July and OpenSym on 27th-29th August.

To qualify for either scholarship, you must be based in the UK, be able to travel to Berlin and attend all days of the event, and agree to produce a public report (which may be published on the Wikimedia UK blog and in our newsletters) summarising the key things that you have taken from the event. Applicants for OpenSym must also be engaged in research about Wikimedia or other free content projects. The scholarship will cover the conference registration fee, travel and accommodation, along with a per diem allowance to cover local expenses.

Complete this online form by Sunday 20th April to apply for a scholarship to OKFestival. The deadline for the OpenSym scholarship is Sunday 30th April, and you can apply here.

by Katie Chan at April 14, 2014 09:46 AM

Jeroen De Dauw

Wikibase DataModel 0.7.3 released

I am happy to announce the 0.7.3 release of Wikibase DataModel.

Wikibase DataModel is the canonical PHP implementation of the Data Model at the heart of the Wikibase software. It is primarily used by the Wikibase MediaWiki extensions, though it has no dependencies whatsoever on these or on MediaWiki itself.

This release contains a new API for working with labels, descriptions and aliases, and it deprecates the old API for those.

At the core of the new API is a simple value object representing an until now unnamed domain concept: the part of an Item or a Property that holds all those labels, descriptions and aliases. The name we gave it is Fingerprint. (Credits and blame for the name go to Lydia :)) The Entity class now has a getFingerprint and a setFingerprint method. Fingerprint itself has getLabels, getDescriptions and getAliases methods. The first two return a TermList, which is made up of Term objects. The latter returns an AliasGroupList, which consists of AliasGroup objects.

Why these changes? What was wrong with this approach?

$entity->setLabels( array( 'en' => 'foo', 'de' => 'bar' ) );

The old API is defined by no fewer than 17 methods in Entity. They add a lot of code to it, contributing to Entity being the highest-complexity class in DataModel. That our core (DDD) entity is also our most scary class is obviously not good. Moving the responsibility to a value object inside of Entity is a clean way to tackle this problem, and is also more in line with how the other parts of Entities, such as the list of Claims, are handled. On top of the complexity issue, the old API also fares badly in terms of interface segregation. Most code dealing with Terms (i.e. Labels, Descriptions and Aliases) will not care about all the rest of Entity. Hence it makes no sense to have to feed it an entire Entity object when a Fingerprint, or one of the objects composited into it, would make more sense.


Another important reason to move in this direction is that I want to see Entity go. If you are familiar with the project, this might seem like a quite preposterous statement. Kill Entity? How can I possibly think that is a good idea, and how will the code be able to still work afterwards? In short, Entity tries to unify things that are quite different in a single class hierarchy. The difference between those objects creates a lot of weirdness. Entity contains a list of Claim, while Item, one of its derivatives, requires a list of Statement, Statement being a derivative of Claim. And not all Entities we foresee will have Claims, Fingerprint, etc. The only thing they will all have is an EntityId, and all we need to facilitate that is a simple HasEntityId interface. All the rest, including the Fingerprint, can be composited in by classes such as Item and Property that implement the appropriate interfaces. Those changes are for the next big release, so if you are using DataModel, it is recommended you switch to using the new Fingerprint API as soon as possible.

And I’m still not done with the list – wow. A final reason for the change is that the old API was not only ugly (weird function signatures in places), it was also quite inconsistent in its implementation. It has had TODOs in there since the start of the project that state things such as “figure out if we care about duplicates” and “are empty strings valid?”. The new implementation properly takes care of these things, and does so in all cases where it should, rather than only in assorted random functions. That those old TODOs remained there for nearly two years goes to show how likely it is that people “go back to fix it”.

You can do everything you could do with the old implementation with the new one. There are however some things that might be slightly more cumbersome for now, especially in code that is taking in entire Entity objects while only needing a Fingerprint. As we migrate to the new implementation, it will become clear which convenience functions will pay for themselves, so those will likely be added in the near future. At the same time, several tasks are already easier to do now. The new value objects will also likely provide a good place to put functionality that was oddly placed before.
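For illustration, here is a minimal sketch of what the labels example from earlier might look like against the new API. It sticks to the class and method names mentioned in this post (Fingerprint, Term, TermList, AliasGroupList, getFingerprint/setFingerprint); the namespaces, constructor signatures and the Term accessor used below are assumptions, so check the release notes for the exact 0.7.3 API.

<?php
// Hypothetical sketch of the new Fingerprint API. Namespaces, constructor
// signatures and Term::getText() are assumptions and may differ in 0.7.3.
use Wikibase\DataModel\Term\Fingerprint;
use Wikibase\DataModel\Term\Term;
use Wikibase\DataModel\Term\TermList;
use Wikibase\DataModel\Term\AliasGroupList;

// The same labels as the old setLabels() call shown earlier.
$labels = new TermList( array(
	new Term( 'en', 'foo' ),
	new Term( 'de', 'bar' ),
) );

// Assumed constructor order: labels, descriptions, alias groups.
$fingerprint = new Fingerprint(
	$labels,
	new TermList( array() ),
	new AliasGroupList( array() )
);

// $entity is an existing Item or Property; attach the terms via the new accessor pair.
$entity->setFingerprint( $fingerprint );

// Reading back: getLabels() returns a TermList of Term objects.
foreach ( $entity->getFingerprint()->getLabels() as $term ) {
	echo $term->getText(), "\n";
}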

For a list of changes, see the release notes.

And in anticipation of the next release, have some WMDE kittens:


by Jeroen at April 14, 2014 12:08 AM

Tech News

Tech News issue #16, 2014 (April 14, 2014)


April 14, 2014 12:00 AM

April 13, 2014

Amir E. Aharoni

Where to read about the Elections in India?

There is an election process going on in India, which is frequently called “the world’s largest democracy” and an “upcoming world power”. Both descriptions are quite true, so elections in such a country should be pretty important, shouldn’t they?

Because of my work I have a lot of Facebook friends in India, and they frequently write about it. Mostly in English, and sometimes in their own languages—Hindi, Kannada, Malayalam and others. Even when it’s in English I hardly understand anything, however, because it is coming from people who are immersed in Indian culture.

It is similar with Indian English-language news sites, such as The Times of India: The language is English, but to me it feels like information overload, and there are too many words that are known to Indians, but not to me.

With English-language news sites outside of India, such as CNN, BBC and The Guardian it’s the opposite: they give too little attention to this topic. I already know pretty much everything that they have to say: a huge number of people are voting, Narendra Modi from the BJP is likely to become the new prime minister and the Congress party is likely to become weaker.

Russian and Hebrew sites hardly mention it at all.

What’s left? Wikipedia, of course. Though far from perfect, the English Wikipedia page Indian general election, 2014 gives a good summary of the topic for people who are not Indians. It links terms that are not known to foreigners, such as “Lok Sabha” and “UPA”, to their Wikipedia articles, so learning about them requires just one click. When they are mentioned in The Times of India, I have to open Wikipedia and read about them, so why not do it in Wikipedia directly?

This also happens to be the first Google result for “india elections”. And if you go to the page “Elections in India” in Wikipedia, a note at the top conveniently sends you directly to the page about the ongoing election process. Compare this to the Britannica website: searching it for “india elections” yields results that are hardly useful—there’s hardly anything about elections in India in general, let alone about the current one.

One thing that I didn’t like is the usage of characteristic Indian words such as “lakh” and “crore”, which mean, respectively, “a hundred thousand” and “ten million”. I replaced most of their occurrences in the article with the usual international numbers, and I think that I found a calculation mistake along the way.

So while Wikipedia is, again, far from perfect, its “wisdom of the crowds” system works surprisingly well time after time.


Filed under: India, Wikipedia Tagged: elections

by aharoni at April 13, 2014 12:49 PM

Gerard Meijssen

#Wikidata - ประนอม รัชตพันธุ

When #Quality is the objective and when quality is to be measured, it helps when there is something that demonstrates how Wikidata provides quality. The 2014 deaths provide a great opportunity; currently we know about 2,108 people who died.

Making this list complete is not always that easy. The lady whose portrait you see has an article on the Thai Wikipedia; it indicates that she died and, with Google Translate, I find the following:
Born October 1, 2457
Province, Thailand
Died 17 January 2557 (99 years).
Bangkok
Thailand Nationality
The Thai dates have to be converted to Gregorian dates; it is great that there is functionality around that helps with the necessary conversion. The question is whether our Thai users can enter their dates in the Thai format.
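For what it is worth, the year arithmetic behind that conversion is simple: the Thai Buddhist Era (BE) year runs 543 years ahead of the Gregorian year, so 2457 BE is 1914 and 2557 BE is 2014, which matches the age of 99 given above. A minimal sketch (the function name is just an illustration, not an existing Wikidata or MediaWiki helper):

<?php
// Convert a Thai Buddhist Era (BE) year to a Gregorian (CE) year.
// For modern dates the BE count runs 543 years ahead of the Gregorian count.
function buddhistEraYearToGregorian( $beYear ) {
	return $beYear - 543;
}

echo buddhistEraYearToGregorian( 2457 ), "\n"; // 1914, the year of birth
echo buddhistEraYearToGregorian( 2557 ), "\n"; // 2014, the year of death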
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at April 13, 2014 07:01 AM

Pete Forsyth, Wiki Strategies

Is Wikipedia better than other encyclopedias?

I recently encountered the question: “Has Wikipedia surpassed the quality of traditional encyclopedias?” Here’s my answer:

Absolutely, yes. Wikipedia has established ways of thinking about encyclopedic “quality” that never existed before. Oh, a few ideas off the top of my head — before Wikipedia, nobody would have ever thought it might be possible to find, in a single general interest encyclopedia:

  • Easy ways to fix or expand the content
  • Links to articles on the same topics in hundreds of other language editions
  • Basic information about local government, including state legislators, landmark legislation, etc.
  • Endless insights into the thinking of the people building the encyclopedia
  • Extensive lists of links to more authoritative or in-depth sources
  • Access to other people who share your interests and may be able to answer your questions, or help you figure out how to pitch in on building the encyclopedia

etc. etc.

There are of course some ways in which the traditional model is superior; they are worthy of attention. But in my opinion they are very much the footnote, not the headline.

To directly answer the final part of the question: no, Wikipedia — like traditional encyclopedias — should never be regarded as an infallible source. The main function of an encyclopedia has always been as a starting point that can help a reader find his or her way to more reliable information. The fact that Wikipedia can be edited by anyone is its great strength; in addition, it serves as a continual reminder that we (as readers) should look more closely before taking what we read for granted.

by Pete Forsyth at April 13, 2014 03:06 AM

April 12, 2014

Laura Hale

Thank you Cindy

I’m emotionally crushed.  For the second time this week, I had to write an obituary about a female Wikimedian involved in addressing the movement’s gender gap.  Wikimedian Cindy Ashley-Nelson died at the Wikimedia Conference in Berlin early yesterday morning.  Her death follows that of Wikimedian activist Adrianne Wadewitz who died earlier in the week after a rock climbing accident on March 29.

Both women were inspiring in terms of their leadership, their contributions to Wikipedia while being active behind the scenes in movement governance, and their dedication.  Like myself, both believed that contributing to Wikimedia projects could change the world, and that knowledge is power.  Their individual contributions embody that.

While I did not have personal relationships with either, they served as role models in the community and brought attention to issues in the community as insiders that would not have otherwise been possible.  They participated in an environment that can at times be incredibly hostile towards women while being very successful.  I cannot easily see how the holes they left will be filled. :(

I am thankful that in their lives, they spent time contributing.  I hope they can continue to live on forever in the collective community memory.


by Laura H at April 12, 2014 02:35 PM

Cindy, in memoriam

Laura H:

Vale Cindy. Your efforts in the movement will not be forgotten.

Originally posted on Walk the Talk:


Cynthia Ashley-Nelson died yesterday. She was attending the Wikimedia Conference as an AffCom member, and on Thursday had participated in her first annual AffCom meeting. The news about her death has surprised and shocked the people at the conference. I realise there are many people who might not be familiar with her, so I wanted to write a few words about her and the impact she made on those who knew her.

In my role as Board liaison to the Affiliations Committee, I had seen Cindy, as her friends called her, apply to become a member, and how she was finally elected to the committee. She had such a solid background, so relevant to the work AffCom does, and she was such a strong candidate that it was a no-brainer for AffCom to elect her. They were not disappointed. Cindy was participative, incredibly engaged from the first day on the work of…



by Laura H at April 12, 2014 01:31 PM

Gerard Meijssen

#Wikidata - George Bookasta

Mr Bookasta has an article on the English Wikipedia. Until recently there was no item for him on Wikidata. With a new tool by Magnus, the Creator, items were added for all articles that are in the categories of people who were born or who died after 1850.

Since then, two interlanguage links have been added for Mr Bookasta. Thanks to a query we can track who is added to the category of 2014 deaths but is not yet known to be dead in Wikidata.

As all the people who died in 2014 have their deaths noted, it is only a matter of keeping up. The article for Mr Bookasta is small but includes many bits of information that can be added to Wikidata. One thing that I did not add is that he was also a big band leader.
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at April 12, 2014 07:03 AM

Wikimedia Foundation

A GLAMorous romance

This post is available in 2 languages:
English  • Català

English

QR codes at Joan Miró Foundation, 2011

One of the most fruitful collaborations between the community of Catalan-speaking Wikipedians and the GLAM institutions (Galleries, Libraries, Archives, and Museums) is taking place at the Joan Miró Foundation in Barcelona. Let’s peer into this little love story between GLAMwiki pioneers.

The first rendezvous occurred in September 2011, when the Foundation was preparing the exhibition “Joan Miró and the Scale of Evasion”, using QRpedia to offer visitors QR codes linking to Wikipedia articles. A wikimarathon followed by a translation campaign was organized by wikiGLAM volunteers in order to ensure complete articles, translated into several languages, for the seventeen most remarkable works in the exhibition. The participation in this first experience consisted mostly of core editors, who worked on the initial seventeen articles and created fifty more. All these articles, originally written in Catalan, were completed with a range of two to fifty translations. Their effort resulted in more than 12,000 readings of QR codes during the period of the exhibition.

This was the beginning of a great friendship; the GLAMwiki experiment proved that a community of motivated volunteers and the good predisposition from a welcoming institution could bring good results. However, after this first experience together, the volunteers and the institution each followed their own paths.

Miró Editathon, 2013

The next rendezvous would come two years later, in 2013. Espai 13, a space within the Joan Miró Foundation devoted to exhibit works of young and emerging artists, was celebrating its 35th anniversary. Wikipedia volunteers and the institution thought that this was a good occasion to work together again. They ran another wikimarathon together, the longest organized in Catalonia so far, lasting 35 hours, topped by the coincidence of creating the 400,000th article of Catalan Wikipedia during the event.

This time, core Wikipedia editors mingled with a legion of new users who came from universities and the fine arts scene. They created 121 articles (in Catalan, Spanish, English, and even French) about artists and commissioners involved with Espai 13 during its thirty-five-year history. The romance between Wikipedia and the Joan Miró Foundation took clear steps forward. The names of the viquipedistes were listed in the acknowledgments section of the exhibition, and a plaque was hung in the main room to commemorate the wikimarathon that created Wikipedia articles for all the featured artists.

The volunteers and the Foundation liked each other and decided to formalize their relationship at the end of 2013. In order to define an action plan, the first step taken was an audit of the content related to Joan Miró in Wikimedia projects. This plan resulted in the hiring of a Wikipedian-in-residence, who worked at the Foundation for eight weeks. To increase the knowledge about Joan Miró and his works in different language Wikipedias, this editor dedicated most of the project time to developing materials created by the institution into freely licensed publications.

This GLAMwiki collaboration will reach its peak on Saturday, May 10th, with the organization of the Joan Miró Global Challenge. This event consists of a 10-hour global sprint (8:00 – 18:00 UTC) to create and expand a selection of ten articles related to Miró. This challenge is inspired by the Catalan Culture Challenge, another online activity open to writers of any language. Volunteers in Barcelona can attend the local wikimarathon at the Joan Miró Foundation, where they will be invited to a guided tour through the Foundation’s warehouse-archive and to edit together. A selection of materials will be made available to the local participants, who will also receive a small gift in addition to the gratitude of the Foundation.

Be part of this action to spread all we know about Joan Miró and his works. Join this wikiGLAM love story.

Esther Solé, Wikipedian-in-Residence at Joan Miró Foundation

Català

Un amor GLAMurós

Una de les relacions més fructíferes entre la comunitat viquipedista catalana i les institucions GLAM s’ha esdevingut a la Fundació Joan Miró de Barcelona. Estem davant d’una petita història d’amor entre pioners del GLAMwiki.

Els primers cites van començar el setembre de 2011, quan la Fundació va incloure la tecnologia de la QRpedia a l’exposició “Joan Miró i l’escala de l’evasió”. Per tal que la informació sobre les 17 obres més destacades de l’exposició estigués disponible a la Viquipèdia en diversos idiomes, es va dur a terme una marató d’edicions, seguida d’una campanya de traduccions a diferents idiomes.

D’aquesta experiència, on principalment participaren core editors, es van treballar els 17 articles proposats, se’n van crear aproximadament 50 de nous i es va obrir un ventall de traduccions d’entre 2 i 50 idiomes, que van permetre que durant el temps que l’exposició va estar oberta al públic es registressin més de 12.000 lectures dels codis QR de les cartel·les. Aquest fou l’inici d’una gran amistat, la constatació que amb una comunitat de voluntaris motivats i una bona predisposició per part de la institució acollidora, els projectes GLAMwiki poden donar resultats. Tanmateix, un cop finalitzada l’experiència, cadascú va seguir el seu camí.

El retrobament va tenir lloc dos anys després, ja que l’Espai 13 de la Fundació Joan Miró, un espai expositiu pensat per presentar l’obra d’artistes joves i emergents, celebrava el seu 35è aniversari. Aquesta circumstància ens va semblar una bona ocasió per tornar a treballar junts i, davant la perspectiva d’una exposició commemorativa d’aquest 35è aniversari, es va dur a terme altra marató d’edicions. Aquesta viquimarató ha resultat ser la més llarga fins al moment (35 hores) i on hi va haver la coincidència que es va crear l’article 400.000 de la Viquipèdia en català.

Viquimarató de l’Espai 13 de la Fundació Joan Miró (2013)

En aquest esdeveniment, els core editors de la Viquipèdia en català es van mesclar amb una legió de nous usuaris provinents d’entorns universitaris i de l’àmbit de les belles arts, els quals van crear 121 articles nous —tant en català com en castellà, anglès i fins i tot francès— sobre diversos artistes i comissaris de les exposicions que han tingut lloc a l’Espai 13 al llarg d’aquests 35 anys de trajectòria. A més, l’idil·li entre la Viquipèdia i la Fundació Joan Miró ha fet un clar pas endavant quan els viquipedistes s’han vist mencionats als agraïments del catàleg de la mostra i quan a la primera sala de l’exposició es té un record per la viquimarató que permeté que tots els artistes participants a la mostra tinguessin un article a la Viquipèdia en diversos idiomes.

Ens agradàvem, i a finals de 2013 vam decidir formalitzar la nostra relació. La primera acció que es va dur a terme fou una auditoria de l’estat dels continguts mironians a la Viquipèdia en general, per tal de determinar línies d’actuació. Aquestes es concretaren en la incorporació d’una viquipedista resident a la Fundació, que durant 8 setmanes va treballar en l’obertura dels continguts que es generen a la institució per tal de millorar tant la presència com la qualitat del coneixement sobre Joan Miró i la seva obra que es troba disponible a la Viquipèdia en diversos idiomes. Aquesta tasca tindrà el seu punt culminant el proper 10 de maig, quan es durà a terme la “Joan Miró Global Challenge”, un esdeveniment de caràcter global amb l’objectiu que, entre les 10:00 i les 20:00h (GMT +1), es creïn i s’ampliïn els continguts d’una selecció de 10 articles de temàtica mironiana.

La proposta, inspirada en la Catalan culture challenge, està oberta a viquipedistes d’arreu del món, que estan convidats a participar des de casa seva o a venir presencialment a la Fundació Joan Miró de Barcelona per fer una visita al magatzem-arxiu de la institució i editar plegats. S’ha previst posar a disposició dels participants una col·lecció de materials per poder elaborar còmodament els articles en qüestió, i tots els participants tindran —a més de l’etern agraïment de la Fundació— una petita recompensa.

Us animeu a formar part d’aquesta història d’amor GLAM? Voleu formar part d’aquesta acció per portar Joan Miró i la seva obra arreu del món?

Esther Solé, viquipedista resident de la Fundació Joan Miró

2014-04-12: Edited to shorten a sentence in the English version

by ESM at April 12, 2014 12:26 AM

April 11, 2014

Wikimedia Foundation

Supporting innovation beyond the traditional IP regime: Using Wikipedia as a model

Michigan State College of Law Professor Sean Pager responds to comments on his presentation during the “Cultural Production Without IP” panel.

Intellectual property (IP) rights like copyrights, patents and trademarks are given to scientists and authors to reward them for their contributions to the arts and sciences. These exclusive rights allow them to monetize their work. But piracy and the sale of goods that infringe IP rights are steadily increasing. From 2000 to 2007, trade in counterfeit and pirated products increased 7.6% among all globally-traded commodities – and this number excludes all electronic piracy.[1] In the European Union alone, customs agents intercepted almost 40 million infringing articles being imported into the member states in 2012.[2] Yet, despite the profit loss that undoubtedly comes with infringement, scientists and artists continue to create, signalling that (1) economic incentives are not the only driving force behind innovation, and (2) laws outside IP are supporting this innovation.

On Sunday, March 30th, the Information Society Project (ISP) [4] at Yale Law School hosted the Innovation Law Beyond IP conference to explore these issues. The event brought together some of the most reputable scholars in IP to discuss how the law can be used to promote innovation outside the context of private IP rights. The discussion centered around the trends of innovation already occurring without IP protection and looked to develop areas of law that can play a positive role in supporting innovation beyond the domain of IP law. Wikimedia Legal Fellow Manprit Brar followed this discussion to think about what lessons can be learned for Wikimedia’s legal work.

Yale Law School Professor Amy Kapczynski opened the conference by framing the discussion of innovation law as having no one focus. She discussed the many alternative areas of law to IP that are used to help sustain and encourage innovation, including:

  • Procurement law – where governments can directly fund innovative work;
  • Tax law – that can provide tax incentives to creative industries;
  • Human capital law – where employment law and antitrust can be used to maximize innovation through the propertization of “the inputs of innovation – people, their skills, experience, knowledge, professional relationships, creative and entrepreneurial energies, and the potential for innovating;”[5]
  • Regulatory law – using regulatory mandates to force innovation (e.g. by requiring car manufacturers to develop technology that meet certain fuel efficiency standards);
  • Tort law – imposing liability for failing to innovate (e.g. developing testing procedures to ensure products are safe);
  • Contracts law – where parties can contract for innovative work instead of waiting for the future private IP rewards for the final product.

Current IP practices are quickly becoming outdated as the world changes in response to massive technological developments. It is now more important than ever to look to these alternate areas of law to encourage innovation.

Stanford Law Professor Mark Lemley discussed how technology like the Internet is eating away at the artificial scarcity created by IP law. The advent of the Internet significantly reduced the cost of distributing content and expanded the bounds of distribution worldwide to anyone with an Internet connection. The Internet dissolved the separation between the creation of content and the distribution of it by allowing you to do it yourself, rather than forcing you to engage a middle person for distribution. As a result, people are creating and distributing their content at incredible rates – and they are doing this outside of the marketplace and without extensive IP protection. Attempts to maintain this artificial scarcity are prevalent in our society as our legal systems go after file sharers, our schools begin to teach kids not to download from the Internet, and our politicians propose legislation to give IP owners greater power over the Internet (think SOPA). But, in the end, these efforts are somewhat futile.

So, what motivates people to create content and share it with the world without ensuring their IP rights in the content? What incentives are currently available to encourage people to do this? These are the types of questions that scholars are now considering and applying to determine how the law can be used to support innovation without traditional IP protections. Throughout the conference, one thing was very clear: the role of IP is smaller than once thought compared to other existing infrastructure that allows creators to produce and distribute their work.

Cultural production without using IP as the primary incentive

One of the first panel discussions delved into contributions to culture and the related IP concerns, which is something that doesn’t often factor into the discussion of IP in the US. IP law is not often seen as a way to enrich our culture so much as a commercial tool to maximize profit. However, an overarching theme found in the research of Sean Pager and Jessica Silbey is that reputational incentives like attribution and integrity are very important to creators. Additionally, it appears copyright is more important at the distribution stage than during creation. Authors and scientists don’t create with the intention of using IP to make money for themselves, but they will use it afterwards to protect their work.

Michigan State College of Law Professor Sean Pager discussed alternative modes of encouragement within the context of indie films and looked at how these could be used to encourage greater cultural diversity in the film industry. Familiar models used to encourage creation in the indie film industry include copyright, but direct state funding, tax incentives and certain infrastructure support for creating the films are also available. However, all the current models of support involve the use of gatekeepers, which means there are a narrow set of decision makers that all hold their own biases, which may not be the same as those held by society. As a result, the content produced is not as diverse as the society it is meant to serve. Pager suggests the use of distributed models of encouragement dissemination to bypass gatekeepers, which will ultimately result in more diverse content production.

To further explore the motivations behind innovation, Jessica Silbey presented a chapter of her book “Real Accounts from Creators and Innovators: Making Do with an IP Misfit”. For her book, she conducted a study using face to face interviews to determine if current US IP laws actually function “to promote the progress of science and useful arts, by securing for limited times to authors and Inventors the exclusive right to their respective writings and discoveries.”[6] Silbey interviewed authors, scientists, engineers, IP lawyers and business executives working in the field to determine what role IP law actually plays in their work. Her results showed a misalignment between the motivations for innovation and the current laws that exist to promote that innovation. Creators of content and inventors are largely concerned with their reputation among their peers and are focused on having their hard work and time valued, which is not a concept protected by IP laws. You won’t get a patent on your new invention just because you worked hard on it. The interviews did confirm that creators want to be able to convert the value of their work into tangible things that can be protected. Future scholarship should look to how IP can support production rather than focusing solely on using it to incentivize creation.

Privacy and innovation: Forever in conflict?

The next round table discussion focused on the link between privacy and innovation. The general thinking surrounding these concepts is that they are in direct conflict with each other. One perspective is that privacy restrictions should be loosened to provide greater access to information in order to have the freedom to innovate. If personal information is the fuel for the information economy, privacy laws can be seen as barriers to the flow of this information, which hinders innovation. Participants noted the correlation between stronger privacy laws and weaker innovation in the European Union in comparison with the US. The participants were careful to note that this correlation does not necessarily prove a causative relationship between the two.

Participants also discussed how privacy is needed to encourage innovation. Privacy is required to generate trust in online ecosystems and through that trust, innovation can be generated. The round-table discussion touched on issues of commercialized surveillance by information businesses like Google and Facebook that use consumer information to maximize advertising profits. In their discussion, participants noted a possible need for privacy rules to enhance competition and thus innovation. The discussion generally reflected many of the issues recently raised by Wikimedia community members in the debate around the new Wikimedia privacy policy.

Using prizes and grants to stimulate innovation instead of IP


Grants and prizes are now being looked to as alternative methods to IP rights for promoting innovation. Michael Burstein and Fiona Murray explored the governance challenges faced by prize competitions where innovation is being encouraged through direct rewards. The prizes currently awarded to the winners are not seen as substitutes for IP but as complements to existing IP rights. The researchers focused on the Progressive Insurance Automotive X Prize (PIAXP) in their study. They found the participants had many motivations for competing aside from the prize money; these included IP rights, but also the reputational benefits from winning, as well as the simple desire to compete in a research challenge. Additionally, although the rules of the competition kept changing throughout, participants accepted these changes because they were perceived as legitimate and fair responses to how the competition was progressing. When thinking of ways that prizes can be used by the government to encourage innovation on a wider scale, the government is in a good position to value the innovation needed and so ensure the prize is of a sufficient amount. The government has this advantage because it has an informational advantage in sectors such as public health, where innovation is most useful.

As a complement to the discussion of prizes, Bhaven Sampat held an intriguing talk about his paper “The Unexpected Political Economy of Serendipity,” which looks at the innovation that originates from grants. More specifically, his paper focuses on how research under National Institutes of Health (NIH) grants for specific areas leads to advances in other seemingly unrelated fields. His research consisted of connecting NIH grants with medical publications and then connecting those publications with patents and FDA-approved drugs using the Orange Book. Through these connections, Sampat was able to find the link between NIH grants and the commercially available pharmaceuticals created as a result of those grants. Sampat’s findings show that at least 30% of new pharmaceutical drugs seem to result from so-called “serendipitous discoveries.”

This research on medical grants and the existing rule regimes governing them can be used to develop regulations to govern prize competitions. Burstein and Murray’s finding that developing a perfect set of rules is unnecessary suggests it may be more effective to simply provide a fair set of rules that garners the participation of innovators, without unnecessarily restricting the outcome of their research, so as to allow for serendipitous discoveries.

Wikimedia Projects as a model of innovation beyond IP

Previously, the dominant theory in IP was that IP rights were needed to promote innovation and that without the protections afforded by IP law, the incentives to create would disappear. In reality, the reverse has happened. There are more books, videos, songs and content in general being produced than ever before. We can just take a look at the growth of projects like Wikipedia and Wikimedia Commons to see that. People enjoy creating and sharing their content online and because of the Internet, it is now possible for anyone with at least some talent to make a song or video at minimal cost. The Internet has ultimately unlocked the gatekeepers of creation. Before, if someone wanted to create music that people would hear, they needed a major record label to produce and distribute that music. But now, you can simply use your webcam to record your music and upload a video onto a site like YouTube and potentially reach millions of viewers. That is after all how artists like Justin Bieber, Bo Burnham, and Greyson Chance got their start! In sum, people create because they can, because they want to, because they are interested in it and not because IP laws will allow them to collect royalties 70 years after their death.

The Wikimedia projects can also be said to have unlocked the gatekeepers of creation, as they allow anyone to become a volunteer, contribute to the writing of articles and add other content. Through the projects, individuals can choose to contribute as much or as little as they wish without having to get approval from a publisher or other middle person in order to distribute that content to the world. The underlying assumption of IP law is that we need to allow people to control what they create or else they won’t create. However, with the existence of things like Creative Commons (CC) licensing, this assumption does not hold. Editors on Wikipedia, for example, contribute original content to the sites under CC BY-SA licensing, which allows anyone to come and adapt their contributions (provided they give appropriate attribution). Essentially, editors relinquish control over their content upon creation – and they are okay with that because they support the mission of the project. The English-language Wikipedia alone has upwards of 4.4 million articles[7] and the encyclopedia is still growing every day.

Many of the conclusions drawn throughout the conference about how innovation occurs without IP can be seen in the way Wikimedia projects function. Creating under a CC license is consistent with the emerging literature suggesting that a significant motivator behind innovation is not the prospect of holding exclusive IP rights over the product of that innovation. Instead, it is the recognition that comes with innovation that drives creators. All Wikimedia web pages that allow users to edit have a page history that clearly displays every edit made to the page and the user who made it (should they choose to identify themselves).

Sikh pilgrim at the Golden Temple (Harmandir Sahib) in Amritsar, India

Through CC licensing, the requirement of attribution satisfies the need for recognition and thus encourages people to contribute their content for free, for the benefit of others and the expansion of free knowledge for the world. The Wikimedia movement has also already started using prizes and grants to encourage innovation and expand content across several Wikimedia projects. For example, the Wiki Loves Monuments contests encourage the uploading of high-resolution photographs of some of the world’s most beautiful monuments. The outcome of projects like this is a collection of beautiful images, like those selected in the Commons Picture of the Year competition, freely licensed to the world. The dedicated Wikimedia community is a prime example of how valuable innovation occurs outside of the IP context, proving that IP is not integral to innovation.

IP Law’s future role in innovation

Wikimedia projects function contrary to the economic theory that exclusive private rights to IP are needed to encourage creation, showing that current trends in innovation cannot be explained by the existing economic models that underlie IP law. Efforts to reduce the motivations for innovation to a simple model will consistently fail, so regulation needs to adapt continuously to the dynamic nature of human innovation. As the scholarship surrounding innovation advances, it will be interesting to see what regimes are developed to encourage further innovation.

 

Manprit Brar, Legal Fellow

Yana Welinder, Legal Counsel

 

  1. http://www.jec.senate.gov/public/index.cfm?a=Files.Serve&File_id=aa0183d4-8ad9-488f-9e38-7150a3bb62be
  2. http://ec.europa.eu/taxation_customs/customs/customs_controls/counterfeit_piracy/statistics/index_en.htm
  3. The ISP is dedicated to developing the scholarship surrounding the impact of “the Internet and new information technologies for law and society, guided by the values of democracy, development and civil liberties.” http://www.yaleisp.org/about/history
  4. http://balkin.blogspot.com/2014/03/human-capital-law.html
  5. https://en.wikipedia.org/wiki/Wikipedia:Size_of_Wikipedia
  6. Article I, Section 8, Clause 8 of the United States Constitution, https://en.wikipedia.org/wiki/Copyright_Clause
  7. Similarly, technologies like the 3D printer have removed the separation between the act of design and the act of manufacturing a product so that you can design and manufacture products yourself.

by Manprit Brar at April 11, 2014 09:25 PM

Gerard Meijssen

#Wikidata and naval history

The answer to what a GLAM hopes to find in Wikidata can be surprising: "We have several paintings of the Raid on the Medway and we would like to find data about that battle, so that we can place our paintings as illustrations in the sequence of events."

One Dutch admiral, Michiel de Ruyter, has a painting hanging near the paintings of the battle, and it would be great if he could be associated with these events as well.

When you look at the infoboxes of the battles fought as part of the Second Anglo-Dutch War, you will find the commanders of the opposing sides. You will also find that the Treaty of Breda ended this war.

The question is very much how to include all these facts in Wikidata. When we do, and when we include information on the many historic events that already have an item, we become extremely valuable to our partners.
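For a hedged illustration of what those facts look like once they are in Wikidata, the sketch below (not from the post) lists the statements attached to a single item through the public wbgetclaims API. The item ID is only a placeholder, and which property IDs correspond to things like "commander" or "part of" is an assumption to verify on Wikidata itself.

  <?php
  // Hypothetical sketch: list the statements attached to one Wikidata item
  // via the public API. 'Q42' is only a placeholder item ID; replace it with
  // the item for the battle or event of interest. Mapping property IDs to
  // "commander", "part of", etc. is left as an assumption to check on-wiki.
  $itemId = 'Q42';
  $url = 'https://www.wikidata.org/w/api.php'
       . '?action=wbgetclaims&format=json&entity=' . urlencode( $itemId );

  // A descriptive User-Agent is good manners when calling the Wikimedia APIs.
  $context = stream_context_create( array(
      'http' => array( 'header' => "User-Agent: naval-history-sketch/0.1\r\n" ),
  ) );

  $data = json_decode( file_get_contents( $url, false, $context ), true );

  // The response groups statements by property ID (P123, P456, ...).
  foreach ( $data['claims'] as $propertyId => $statements ) {
      echo $propertyId, ': ', count( $statements ), " statement(s)\n";
  }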
Thanks,
       GerardM

by Gerard Meijssen (noreply@blogger.com) at April 11, 2014 02:37 PM

Laura Hale

“He had two knives” or when is a fact a fact?

When is a knife a knife?

“He had two knives” and “Police said he had two knives” are two separate facts. One of the problems some new journalists on English Wikinews have is recognizing what the fact actually is, and the above is a classic example. It is really easy to write opinions or assertions as facts without intending to.

A fair amount of Wikinews writing is synthesis writing.  We may not be able to verify the facts ourselves by talking to sources, witnessing an event ourselves, or reading the original source material.  It is therefore really important to understand exactly which facts our sources present to us.  If a source says, “Police said he had two knives,” the source is not asserting as a fact that he had two knives; the fact being asserted is that the police made that claim.  This is very different from a source that says “He had two knives,” where “he had two knives” is the fact.

This might seem like a minor quibble, but it has the potential to be hugely important.  Picture a court case.  All the evidence is clear: “He had two knives,” which he used to make a peanut butter and jelly sandwich.  Lots of people saw him with two knives.  There were two dirty knives in the sink with his fingerprints on them.  He admitted to having two knives and using them to make sandwiches.  There was a picture of him using two knives to make a sandwich.  His name was engraved on the knives.  We know those knives are his.  That is a fact.

In other situations, this may not be a fact.  There was no picture of him with two knives.  The knives did not have his fingerprints on them and were not found in his house.  He denied that the knives belonged to him and that he used them to make a sandwich.  On the other hand, the police claim that he had two knives.  In this case, maybe the knives do belong to him.  (He could have wiped off his fingerprints, taken the knives out of the house, and lied about owning the knives and making a sandwich.)  What we do know as a fact is that the police made this claim.

These finer points do matter, and they impact how people understand the news they read.


by Laura H at April 11, 2014 02:53 AM

April 10, 2014

Wikimedia Tech Blog

Remembering Adrianne Wadewitz

Portrait of Adrianne Wadewitz at Wikimania 2012 in Washington, DC.

Each of us on the Wikipedia Education Program team is saddened today by the news of Adrianne Wadewitz’s passing. We know we share this sadness with everyone at the Wikimedia Foundation and so many in the Wikimedia and education communities. Our hearts go out to all of you, her family and friends. Today is a time for mourning and remembering.

Adrianne served as one of the first Campus Ambassadors for the Wikipedia Education Program (then known as the Public Policy Initiative). In this role, she consulted with professors, demonstrated Wikipedia editing and helped students collaborate with Wikipedia community members to successfully write articles. As an Educational Curriculum Advisor to the team, Adrianne blended her unique Wikipedia insight and teaching experience to help us develop Wikipedia assignments, lesson plans and our initial sample syllabus. Her work served as a base for helping university professors throughout the United States, and the world, use Wikipedia effectively in their classes.

Adrianne was also one of the very active voices in the Wikimedia community urging participation and awareness among women to tackle the project’s well-known gender gap. She was an articulate, kind, and energetic face for Wikipedia, and many know that her work helped bring new Wikipedians to the project. The Foundation produced a video exploring Adrianne’s work within the Wikipedia community in 2012.

Many in the Wikimedia community knew her from her exceptional and varied contributions, especially in the areas of gender and 18th-century British literature – in which she received a PhD last year from Indiana University, before becoming a Mellon Digital Scholarship Fellow at Occidental College. Since July of 2004, she had written 36 featured articles (the highest honor for quality on Wikipedia) and started over 100 articles – the latest being on rock climber Steph Davis.

Adrianne touched many lives as she freely shared her knowledge, expertise and passions with Wikipedia, her students, colleagues, friends and family. She will be deeply missed by all of us. Our condolences go out to her family during these very difficult times.

Rod Dunican
Director, Global Education

Wikipedia Education Program

  • See Adrianne’s user page on the English Wikipedia, her Twitter account, her home page and her blog at HASTAC (Humanities, Arts, Science and Technology Alliance and Collaboratory)
  • Wikipedians have begun to share their memories and condolences about Adrianne on her user talk page.
  • The leadership of the Wiki Education Foundation, where Adrianne was a board member, have also expressed their condolences.
  • Memorial post from HASTAC Co-founder Cathy Davidson.
  • Wikinews story on the passing of Adrianne Wadewitz.

by Rod Dunican at April 10, 2014 09:24 PM

This month in GLAM

by Admin at April 10, 2014 08:21 PM

Alex Stinson (Sadads)

Losing Adrianne Wadewitz

I woke up this morning to an incredibly shocking email in my mailbox. I got an automatic update from HASTAC announcing the publication of this blog post: http://www.hastac.org/blogs/cathy-davidson/2014/04/10/remembering-adrianne-wadewitz-scholar-communicator-teacher-leader. My co-author, wiki-friend, and mentor in thinking about Wikipedia in the digital humanities was gone, having fallen while pursuing one of her favorite pastimes.

All day I have been shaking from the loss. It’s not that I knew her particularly well personally: we had mostly interacted through digital media and had only met in person at several Wikimedia-related events. It’s that I know the common mission we shared, bridging the Wikipedia and digital humanities communities, has gotten unimaginably harder. Her contribution was tireless and compelling, and finding anyone to fill her shoes will be nigh impossible. The loss feels especially keen to me: for an aspiring communicator in that space, Adrianne was an incredible mentor and model. She had incredible energy and voice, travelling across the United States and the world to spread that vision. She delivered incisive critiques of Wikipedia and of how scholars have responded to shaping that space, and she argued for the need to place women, the humanities and the underprivileged into our public knowledge record.

Just a month ago, Adrianne and I were working through the rejection, by an academic journal, of our paper on the place of history and historical process in Wikipedia. Today I control her intellectual property in that article, as we had yet to find another platform for publishing it. Moreover, we had talked about something beyond our research in that first article: beginning to really understand, through large-scale analysis, how women and the humanities are problematically represented in Wikipedia. Without her voice helping me hone and shape those ideas, and without her experience helping assuage the fears I have about entering the larger academic community, I feel blinded. I need help, and gladly welcome collaboration to meet her goals. Hopefully, we can use this tragedy to find a way to dedicate more research to her vision.

My losses seem rather small when compared to the impact that she clearly had on her family, friends, students and colleagues near her. But I can’t help but think how many internet users, scholars and learners the world over will never understand what they lost with her passing.


by Sadads at April 10, 2014 08:19 PM

Jeroen De Dauw

Diff 1.0 released!

I’m very happy to announce the 1.0 release of the PHP Diff library.

Diff is a small PHP library for representing differences between data structures, computing such differences, and applying them as a patch. For more details see the usage instructions.
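To make that concrete, here is a minimal usage sketch. It is not taken from the release notes or the usage instructions: the class names, namespaces (Diff\Differ\MapDiffer, Diff\DiffOp\Diff\Diff, Diff\Patcher\MapPatcher) and method signatures are assumptions based on how the library is commonly documented, so check the usage instructions for the authoritative 1.0 API.

  <?php
  // Hedged sketch: compute the difference between two associative arrays
  // and apply it to the old value as a patch. Class names, namespaces and
  // signatures are assumptions; confirm them against the usage instructions.
  use Diff\Differ\MapDiffer;
  use Diff\DiffOp\Diff\Diff;
  use Diff\Patcher\MapPatcher;

  require_once __DIR__ . '/vendor/autoload.php';

  $old = array( 'en' => 'Hello', 'de' => 'Hallo' );
  $new = array( 'en' => 'Hello', 'nl' => 'Hallo wereld' );

  // Compute the list of diff operations between the two maps.
  $differ = new MapDiffer();
  $diffOps = $differ->doDiff( $old, $new );

  // Wrap the operations in a Diff and apply them as a patch.
  $patcher = new MapPatcher();
  $patched = $patcher->patch( $old, new Diff( $diffOps ) );

  var_export( $patched == $new ); // true, if the patch applied cleanly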

I created this library as part of the Wikibase software for the Wikidata project. While it is used by Wikibase, it is fully independent of it (and has no dependencies on any other PHP code). The first release was in September 2012, and since then it has seen a new release with refinements and new features every few months. As we have been using it on Wikidata.org for over a year and there are no known bugs in the library, it can be considered quite stable and robust.

The 1.0 release does not add anything in terms of functionality. The primary change it brings is PSR-4 compliance. It also removes some old constructs that had been deprecated for a long time. A detailed list of changes can be found in the release notes.

by Jeroen at April 10, 2014 04:59 PM

Laura Hale

Vale Adrianne Wadewitz

The Wikimedia movement is both very large and very small. I never had the pleasure of meeting Adrianne Wadewitz, but I was very aware of her work and her role in promoting the inclusion of women, both as participants and as topics of articles on Wikipedia. She was very effective at drawing attention to a problem that needs it, on a site where people sometimes form their core base of knowledge about a topic. Thus, it was sad news to learn of her passing this morning. :( She did great work and appeared to do so without alienating a lot of people, something that can be very difficult in the Wikimedia community. Her contributions to making Wikipedia, and the world, a better place will be missed.


by Laura H at April 10, 2014 04:38 PM

Wikimedia Tech Blog

Wikimedia’s response to the “Heartbleed” security vulnerability

English

Logo for the Heartbleed bug

On April 7th, a widespread issue in a central component of Internet security (OpenSSL) was disclosed. The vulnerability has now been fixed on all Wikimedia wikis. If you only read Wikipedia without creating an account, nothing is required from you. If you have a user account on any Wikimedia wiki, you will need to re-login the next time you use your account.

The issue, called Heartbleed, would allow attackers to gain access to privileged information on any site running a vulnerable version of that software. Wikis hosted by the Wikimedia Foundation were potentially affected by this vulnerability for several hours after it was disclosed. However, we have no evidence of any actual compromise to our systems or our users’ information, and because of the particular way our servers are configured, it would have been very difficult for an attacker to exploit the vulnerability in order to harvest users’ wiki passwords.

After we were made aware of the issue, we began upgrading all of our systems with patched versions of the software in question. We then began replacing critical user-facing SSL certificates and resetting all user session tokens. See the full timeline of our response below.

All logged-in users send a secret session token with each request to the site. If a nefarious person were able to intercept that token, they could impersonate other users. Resetting the tokens for all users has the benefit of making all users reconnect to our servers using the updated and fixed version of the OpenSSL software, thus removing this potential attack.

We recommend changing your password as a standard precautionary measure, but we do not currently intend to enforce a password change for all users. Again, there has been no evidence that Wikimedia Foundation users were targeted by this attack, but we want all of our users to be as safe as possible.

Thank you for your understanding and patience.

Greg Grossmeier, on behalf of the WMF Operations and Platform teams

Timeline of Wikimedia’s response

(Times are in UTC)

April 7th:

April 8th:

April 9th:

April 10th:

Frequently Asked Questions

(This section will be expanded as needed.)

  • Why hasn’t the “not valid before” date on your SSL certificate changed if you have already replaced it?
    Our SSL certificate provider keeps the original “not valid before” date (sometimes incorrectly referred to as an “issued on” date) in any replaced certificates. This is not an uncommon practice. Aside from looking at the change to the .pem files linked above in the Timeline, the other way of verifying that the replacement took place is to compare the fingerprint of our new certificate with our previous one.
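As a hedged illustration of that last point, the sketch below (not part of the announcement) fetches the certificate a server currently presents and prints its SHA-1 fingerprint, which can then be compared with the fingerprint of the previously published certificate. The host name is only an example, and SHA-1 is used here because it was the fingerprint format most tools displayed at the time.

  <?php
  // Hedged sketch: connect to a host over TLS, capture the certificate it
  // presents, and print that certificate's SHA-1 fingerprint. The host is
  // only an example; compare the output against the previously published
  // fingerprint to confirm the certificate really was replaced.
  $context = stream_context_create( array(
      'ssl' => array( 'capture_peer_cert' => true ),
  ) );

  $client = stream_socket_client(
      'ssl://en.wikipedia.org:443', $errno, $errstr, 30,
      STREAM_CLIENT_CONNECT, $context
  );

  $params = stream_context_get_params( $client );
  $cert = $params['options']['ssl']['peer_certificate'];

  // The fingerprint is the hash of the DER-encoded certificate, so export
  // to PEM, strip the armor, base64-decode, and hash the result.
  openssl_x509_export( $cert, $pem );
  $der = base64_decode( preg_replace( '/-----[^-]+-----|\s/', '', $pem ) );
  echo rtrim( strtoupper( chunk_split( sha1( $der ), 2, ':' ) ), ':' ), "\n";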

You can translate this blog post.


 

Deutsch

Wikimedias Reaktion auf die „Heartbleed“-Sicherheitslücke

Logo des Heartbleed-Bugs

Am 7. April wurde ein schwerwiegender Fehler in einem zentralen Baustein der Internet-Sicherheit (OpenSSL) veröffentlicht. Der Fehler wurde nun auf allen Wikimedia-Wikis behoben. Wenn du lediglich Wikipedia ohne ein Benutzerkonto liest, musst du nichts weiter tun. Wenn du ein Benutzerkonto bei irgendeinem Wikimedia-Wiki hast, musst du dich erneut anmelden, bevor du es wieder benutzen kannst.

Der Fehler, der Heartbleed genannt wird, erlaubte es Angreifern, auf privilegierte Informationen auf jeder beliebigen Webseite zuzugreifen, die die vom Fehler betroffene Versionen dieser Software verwendeten. Wikis, die von der Wikimedia Foundation betrieben werden, waren möglicherweise über mehrere Stunden nach Veröffentlichung der Sicherheitslücke davon betroffen. Allerdings haben wir keine Hinweise darauf, dass unsere Systeme tatsächlich angegriffen wurden, und die Konfiguration unserer Server sollte es Angreifern erschwert haben, durch die Sicherheitslücke Passwörter von Benutzern zu entwenden.

Nachdem wir auf die Sicherheitslücke aufmerksam gemacht wurden, begannen wir damit, all unsere Systeme mit korrigierten Versionen der fraglichen Software auszustatten. Danach fingen wir an, kritische, für den Benutzer sichtbare SSL-Zertifikate auszutauschen und alle Benutzersitzungen zu beenden. (Der vollständige Verlauf ist weiter unten dokumentiert.)

Alle angemeldeten Benutzer senden mit jeder Anfrage an die Seite ein geheimes Session Token. Wenn böswillige Angreifer dieses Token abfangen würden, so könnten sie damit sich als andere Benutzer ausgeben. Dadurch, dass wir alle Session Tokens zurückgesetzt haben, ergibt sich der Vorteil, dass alle Benutzer eine neue Verbindung mit den Servern aufbauen, die die korrigierte Version von OpenSSL verwenden, was diesen potenziellen Angriff unmöglich macht.

Zur Sicherheit empfehlen wir allen Benutzern, ihr Passwort zu ändern, aber zur Zeit haben wir nicht vor, dies zu erzwingen. Nochmal: es gibt keine Hinweise darauf, dass Benutzer der Wikimedia Foundation durch diesen Angriff betroffen sind, aber wir wünschen uns größtmögliche Sicherheit für alle Benutzer.

Vielen Dank für dein Verständnis und deine Geduld.

Greg Grossmeier, im Namen der WMF Operations und Platform Teams

Verlauf der Reaktion durch Wikimedia

(Alle Zeiten in UTC)

7. April:

8. April:

9. April:

10. April:

Häufig gestellte Fragen

(Dieser Abschnitt wird bei Bedarf erweitert.)

  • Warum wurde das „nicht gültig vor“-Datum des SSL-Zertifikats nicht geändert, als es ersetzt wurde?
    Der Aussteller unserer SSL-Zertifikate behält das „nicht gültig vor“-Datum (das manchmal auch fälschlich als „ausgestellt am“-Datum verstanden wird) in allen ersetzten Zertifikaten bei. Dies ist nicht ungewöhnlich. Um zu überprüfen, dass der Wechsel stattgefunden hat, kannst du die Änderung an den .pem-Dateien oben im Verlauf nachvollziehen oder den Fingerabdruck des neuen Zertifikats mit dem des alten vergleichen.

Español

Respuesta de Wikimedia ante la vulnerabilidad de seguridad “Heartbleed”

Logotipo del error Heartbleed

El 7 de abril se reveló un problema generalizado en un componente central de la seguridad en Internet (OpenSSL). Ya hemos remediado esta vulnerabilidad en todos los wikis de Wikimedia. Si usted solo lee Wikipedia sin crear una cuenta, no necesita realizar ninguna acción. Si tiene una cuenta de usuario en cualquier wiki de Wikimedia, tendrá que iniciar una sesión nueva la próxima vez que use su cuenta.

El problema, conocido como Heartbleed, permite a posibles atacantes el acceso a información privilegiada en cualquier sitio que utilizara una versión vulnerable de ese software. Las wikis hospedadas por la Fundación Wikimedia estuvieron expuestas durante varias horas después de darse a conocer este punto débil. Sin embargo, no tenemos ninguna evidencia de que nuestros sistemas o la información sobre nuestros usuarios hayan sido afectados, y, debido a la forma particular en nuestros servidores están configurados, hubiera sido muy difícil que un atacante se aprovechara de la vulnerabilidad con el fin de obtener las contraseñas de los usuarios.

Después de habernos enterado del problema, comenzamos a actualizar todos nuestros sistemas con versiones reparadas del software en cuestión. Después empezamos a sustituir certificados SSL críticos y a reestablecer todas las claves de sesión de usuario. El historial completo de nuestra respuesta se muestra a continuación.

Todos los usuarios registrados envían una señal de sesión secreta con cada solicitud al sitio. Si una persona malintencionada fuera capaz de interceptar esa señal, podría suplantar a otros usuarios. La reposición de las señales para todos los usuarios tiene la ventaja de hacer que todos los usuarios se vuelven a conectar a nuestros servidores utilizando la versión actualizada y fija del software OpenSSL, eliminando así este ataque potencial.

Le recomendamos cambiar su contraseña como medida de precaución estándar, pero no tenemos la intención actualmente de hacerlo obligatorio para todos los usuarios. Reiteramos que no hay evidencia de que los usuarios de la Fundación Wikimedia fueran el blanco de este ataque, pero queremos que todos nuestros usuarios estén lo más seguros posible.

Gracias por su comprensión y paciencia.

Greg Grossmeier, en nombre de los equipos de operaciones y plataforma de la Fundación Wikimedia.

Cronología de la respuesta de Wikimedia.

(Los horarios están en UTC)

7 de abril:

8 de abril:

9 de abril:

10 de abril:

Preguntas frecuentes

(Esta sección se expandirá conforme sea necesario).

  • ¿Por qué no cambia la fecha “no válido antes de” en el certificado SSL si ya lo han reemplazado?
    Nuestro proveedor de certificado SSL conserva la fecha original del “no válido antes de” (al que algunas veces se refiere incorrectamente como fecha de “publicado en”) en cualquier certificado reemplazado. No se trata de una práctica inusual. Además de observar los cambios en los archivos de tipo .pem enlazados en la cronología, se puede también verificar que el reemplazamiento se llevó a cabo comparando la huella digital (fingerprint) del nuevo certificado con el del anterior.

Italiano

Risposta di Wikimedia alla vulnerabilità “Heartbleed”

Logo per il bug Heartbleed

Il 7 Aprile è stato scoperto un problema in un componente centrale della sicurezza in Internet (OpenSSL). La vulnerabilità è stata ora individuata e risolta su tutti i progetti di Wikimedia. Se consulti semplicemente Wikipedia senza aver creato un account, questo problema non ti riguarda. Se, invece, hai un account su uno qualsiasi dei progetti Wikimedia, avrai bisogno di fare nuovamente il log in la prossima volta che accederai.

La vulnerabilità, denominata Heartbleed, permetteva agli attaccanti di guadagnarsi l’accesso a informazioni privilegiate su qualsiasi sito utilizzasse una versione vulnerabile del software MediaWiki. I progetti wiki della Wikimedia Foundation sono state, pertanto, potenzialmente affette da tale problema per molte ore prima che il bug fosse scoperto. Non abbiamo, tuttavia, prove di nessuna compromissione dei nostri sistemi o delle informazioni utente e per la maniera in cui i nostri server sono configurati, sarebbe stato molto difficile per un attaccante riuscire a servirsi della vulnerabilità fino a compromettere le password degli utenti.

Dopo che ci siamo accorti del bug, abbiamo iniziato da subito l’aggiornamento di tutti i nostri sistemi tramite l’installazione di versioni aggiornate del software in questione. Abbiamo iniziato a sostituire i certificati SSL compromessi e abbiamo resettato i token delle sessioni utenti. Guarda in basso il timeline della nostra risposta alla vulnerabilità.

Tutti gli utenti che hanno effettuato l’accesso inviano un token segreto di sessione contenente le richieste di accesso al sito. Se un malintenzionato fosse capace di intercettare il token, potrebbe benissimo fingere di essere l’utente che ha mandato la richiesta al sito. Resettando tutti i token abbiamo ottenuto il vantaggio di dover far riconnettere tutti gli utenti ai nostri server che nel frattempo erano già stati aggiornati in modo tale da rimuovere la minaccia di un potenziale attacco.

Vi raccomandiamo, comunque, di cambiare la vostra password come misura standard di precauzione in questi casi ma non abbiamo intenzione di forzarvi a farlo. Ancora una volta ripetiamo, sia ben chiaro, che non sono state trovate tracce di alcun attacco contro gli utenti della Wikimedia Foundation. Il motivo che ci spinge a richiedervi di cambiare password è che vogliamo che i nostri utenti siano quanto più possibile al sicuro.

Grazie per la tua comprensione e la tua pazienza.

Greg Grossmeier, a nome del team Operazioni e Piattaforme della WMF.

Cronologia della risposta di Wikimedia

(Ore in UTC)

7 Aprile:

8 Aprile:

9 Aprile:

10 Aprile:

FAQ (Domande Frequenti)

(Questa sezione verrà estesa quanto sarà necessario)

  • Come mai non è cambiata la data del “Valido dal” sul certificato SSL se è vero che l’avete sostituito?
    I provider dei nostri certificati SSL mantengono la data originale del “valido da” (a volte incorretamente chiamata “data di installazione”) su ogni certificato sostituito. Questa non è da considerarsi una pratica insolita. A prescindere dal cambio dei file .pem linkati più su nella cronologia, un altro modo con cui si può verificare l’avvenuta sostituzione è quello di comparare la firma digitale del nuovo certificato con quello vecchio.

Nederlands

Wikimedia’s reactie op het Heartbleed-beveiligingslek

Logo van de Heartbleed bug

Op 7 april is er een groot beveiligingslek blootgelegd in een belangrijk component van de beveiliging van webpagina’s (OpenSSL). Op alle Wikimedia-wiki’s is dit lek inmiddels gedicht. In het geval u Wikipedia slechts, zonder gebruikersaccount, leest dan hoeft u geen actie te ondernemen. Als u op één of meerdere Wikimedia-wiki’s wél een account heeft dan zult u opnieuw in moeten loggen.

Het zogenaamde Heartbleed-lek gaf aanvallers de mogelijkheid vertrouwelijke informatie te benaderen op elke webpagina die een kwetsbare versie van OpenSSL gebruikte. Wiki’s van de Wikimedia Foundation waren tot enkele uren na het bekend worden van het lek kwetsbaar. We hebben echter geen aanwijzingen dat het lek daadwerkelijk misbruikt is om toegang tot onze systemen, of tot gegevens van gebruikers te krijgen. Het zou daarnaast, door onze specifieke serverconfiguratie, voor een aanvaller erg lastig zijn geweest om wachtwoorden van gebruikers te achterhalen.

Nadat we op de hoogte waren gesteld van het probleem zijn we meteen begonnen we met het upgraden van al onze systemen. Vervolgens hebben onze SSL-certificaten vervangen, en hebben we alle sessietokens gereset. Zie ook de volledige tijdlijn van alle de door Wikimedia ondernomen acties hieronder.

Alle ingelogde gebruikers sturen bij ieder verzoek aan de site een sessietoken mee. Als een aanvaller dat token van een gebruiker onderschept, dan kan deze daarmee voor die gebruiker uitgeven. Door het resetten van de sessietokens kan de aanvaller een token niet meer gebruiken. Bij het opnieuw inloggen wordt een versie van OpenSSL gebruikt waar het lek gerepareerd is, zodat deze aanval afgewend wordt.

Wij raden het aan om uw wachtwoord te veranderen als voorzorgsmaatregel, maar wij verplichten het niet dat alle gebruikers hun wachtwoorden veranderen. Er zijn geen sporen gevonden dat gebruikers het slachtoffer zijn geworden van deze bug, maar wij willen dat al onze gebruikers zo veilig als mogelijk zijn.

Dank u voor uw begrip en uw geduld.

Greg Grossmeier, namens de WMF Operations en Platform teams.

Tijdlijn van de door Wikimedia ondernomen acties

(Tijden zijn in UTC)

7 april:

8 april:

9 april:

10 april:

Veelgestelde vragen

(Deze sectie zal uitgebreid worden wanneer nodig is.)

  • Waarom is de “niet geldig vóór”-datum op jullie SSL certificaat niet veranderd als jullie het certificaat al vervangen hebben?
    Onze SSL certificaat-verstrekker houdt de oude “niet geldig vóór”-datum (soms foutief “verstrekt op”-datum genoemd) bij al de vervangen certificaten. Dit is niet ongewoon. Los van het kijken naar de verandering in de .pem bestanden waar naar wordt gelinkt boven de Tijdlijn, kan ook geverifieerd dat de verandering plaats heeft gevonden door de vingerafdruk van het nieuwe certificaat te vergelijken met onze vorige.

Français

Réponse de Wikimédia à la vulnérabilité de sécurité “Heartbleed”

Logo pour l’anomalie Heartbleed (le cœur qui saigne).

Le 7 avril a été révélé un problème très répandu dans un composant central (OpenSSL) pour la sécurité sur Internet. La vulnérabilité a maintenant été corrigée sur tous les wikis de Wikimédia. Si vous ne faites que lire Wikimédia sans créer un compte, il n’est pas nécessaire que vous fassiez quoi que ce soit. Si vous avez un compte utilisateur sur un wiki de Wikimédia, vous devrez vous reconnecter la prochaine fois que vous utilisez votre compte.

Le problème, appelé « Heartbleed », pourrait permettre à des attaquants d’avoir accès à des informations privilégiées sur tout site exécutant une version vulnérable de ce logiciel. Les wikis hébergés par la Fondation Wikimédia ont été potentiellement affectés par cette vulnérabilité durant quelques heures après sa découverte. Cependant, nous n’avons aucune évidence d’une compromission effective de nos systèmes ou des informations de nos utilisateurs et, du fait de la façon particulière dont nos serveurs sont configurés, il aurait été très difficile à un attaquant d’exploiter cette vulnérabilité afin de récolter les mots de passe des utilisateurs sur nos wikis.

Dès que nous avons été avisés du problème, nous avons commencé à mettre à jour nos systèmes avec des versions corrigées du logiciel en question. Nous avons commencé à remplacer les certificats SSL critiques exposés à l’utilisateur et à réinitialiser tous les jetons de sessions d’utilisateurs. Consultez le calendrier complet de notre réponse ci-dessus.

Tous les utilisateurs connectés envoient un jeton secret de session avec chacune de leurs requêtes au site. Si une personne malveillante était en mesure d’intercepter ce jeton, elle pourrait se faire passer pour d’autres utilisateurs. La réinitialisation des jetons de tous les utilisateurs a l’avantage de faire se reconnecter tous les utilisateurs à nos serveurs en utilisant la version mise à jour et corrigée du logiciel OpenSSL, ce qui ôte cette attaque potentielle.

Nous recommandons de modifier votre mot de passe en tant que mesure standard de précaution, mais nous ne comptons pas actuellement forcer ce changement pour tous les utilisateurs. À nouveau, il n‘y a pas eu d’évidence que les utilisateurs des serveurs de la Fondation Wikimédia ont été la cible de cette attaque, mais nous voulons que nous utilisateurs soient dans une situation aussi sûre que possible.

Merci pour votre patience et votre compréhension.

Greg Grossmeier, au nom des équipes des opérations et plateforme de la Fondation Wikimédia.

Chronologie de la réponse de Wikimédia

(Les heures sont mentionnées dans le fuseau UTC.)

7 avril :

8 avril :

9 avril :

10 avril :

Foire aux questions

(Cette section sera étendue si nécessaire.)

  • Pourquoi la date « non valide avant » de notre certificat SSL n’a-t-elle pas été changée si vous l’avez déjà remplacée ?
    Notre fournisseur de certificat SSL conserve la date originale « non valide avant » (parfois incorrectement décrite comme la date « publiée le ») dans tout certificat remplacé. Ceci n’est pas une pratique rare. En dehors de la consultation des fichiers .pem liés dans la chronologie ci-dessus, l’autre façon de vérifier que le remplacement a eu lieu est de comparer l’empreinte numérique de notre nouveau certificat avec la précédente.

‎українська

Звернення Вікімедіа щодо вразливості безпеки через помилку “heartbleed”

Логотип помилки “Heartbleed”

7го квітня була розкрита широко поширена помилка у центральному компоненті інтернет-безпеки (OpenSSL). Вразливість вже виправлена на всіх проектах Вікімедії. Якщо ви тільки читаєте вікіпедію та не маєте аккаунта, від вас нічого не вимагається. Якщо ж ви маєте аккаунт на будь-якому проекті Вікімедії, ви повинні перелогінитись наступного разу, коли використовуватиме свій аккаунт.

Помилка, яка називається “Heartbleed”, дозволяє хакерам отримати доступ до привілейованої інформації будь-якого сайту, який використовує вразливу версію цього програмного забезпечення. Проекти Вікімедії потенціально могли бути вражені цією помилкою кілька годин після того, як вона була знайдена. Однак, в нас немає жодних доказів шкоди нашим системам чи інформації користувачів. І тому, конфігурація наших серверів зробила використовування цієї помилки хакерами для викрадення паролів дуже складним.

Після того, як ми дізналися про помилку, ми почали оновлювати всі системи виправленою версією програмного забезпечення. Потім ми почали заміняти критичні SSL-сертифікати, що контактують з користувачами, та завершили всі сессії користувачів. Дивіться повну хронологію наших дій нижче.

Всі залогінені користувачі надсилають секретний сесійний токен на кожне звернення до сайту. Якщо нечесна людина перехопить цей токен, вона зможе представлятися іншими користувачами. Скидання цих токенів для всіх користувачів гарне тим, що змушує всіх користувачів перепідключатися до наших серверів використовуючи оновлену та виправлену версію програмного забезпечення OpenSSL, що унеможливлює потенціальні атаки.

Ми рекомендуємо змінити ваш пароль як стандартний профілактичний захід, але ми, на данний час, не збираємося змушувати це робити всіх користувачів. Повторюємо, не було ніяких доказів, що користувачі Вікімедії були атаковані, але ми хочемо щоб всі користувачі були настільки захищені, наскільки це можливо.

Дякуємо за розуміння та терпіння.

Грег Гроссмейер, від імені команд операцій та платформ Вікімедії.

Хронологія дій Вікімедії

(Час вказано в UTC)

7 квітня:

8 квітня:

9 квітня:

10 квітня:

Поширені питання

(Цей розділ буде розширений при необхідності.)

  • Чому “не дійсні до” дати на вашому сертифікаті SSL не змінились, якщо ви вже замінили його?
    Наш постачальник SSL-сертифікатів зберігає справжню “не дійсне до” дату (яку іноді плутають з “видається на” датою) на всіх замінених сертифікатах. Це не непоширена практика. Окрім вивчення змін в .pem-файлах, посилання на які є зверху в хронології, інший спосіб впевнитись, що заміна дійсно була – це порівняти відбитки пальців на новому та старому сертифікатах.

‎asturianu

Respuesta de Wikimedia a la escalabradura de seguridá “Heartbleed”

Logo del bug Heartbleed

El 7 d’abril revelóse un problema estendíu nun componente central de la seguridá d’Internet (OpenSSL). Yá ta iguada esta escalabradura en toles wikis de Wikimedia. Si sólo llee Wikipedia sin crear nenguna cuenta, nun necesita facer nada. Si tien una cuenta d’usuari en cualquier wiki de Wikimedia, tendrá qu’aniciar una sesión nueva la próxima vez qu’use la so cuenta.

El problema, llamáu Heartbleed, permitiría a unos atacantes ganar accesu a información privilexada en cualquier sitiu qu’execute una versión frañada d’esi software. Les wikis agospiaes pola Fundación Wikimedia tuvieron afeutaes demientres delles hores después de que s’espublizara esti fallu. Sicasí, nun tenemos evidencia nenguna de que los nuestros sistemes o la información de los usuarios pudieran tar comprometíos y, pola forma particular en que tenemos configuraos los nuesos sirvidores, sedría mui difícil pa un atacante esplotar el fallu pa collechar les contraseñes de los usuarios de la wiki.

Después de conocer el problema, principiamos por anovar tolos sistemes con versiones iguáes del software en cuestión. Darréu, empezamos a sustituir los certificaos SSL críticos cara al usuariu y reaniciar tolos pases de sesión d’usuariu. Vea más abaxo un diagrama temporal completu de la nuesa respuesta.

Tolos usuarios rexistraos unvien un pase de sesión secretu con cada solicitú al sitiu. Si una persona fuina pudiera interceptar esi pase, podría suplantar a otros usuarios. El reaniciu de los pases pa tolos usuarios tien la ventaya de facer que tolos usuarios vuelvan a coneutase a los sirvidores usando la versión anovada ya iguada del software OpenSSL, torgando asina esti ataque potencial.

Recomendamos que cambie la contraseña como midida de precaución estándar, pero nesti momentu nun tenemos la intención d’obligar a tolos usuarios a facelo. Insistimos en que nun hai evidencia de que los usuarios de la Fundación Wikimedia fueran blanco d’esti ataque, pero queremos que tolos nuesos usuarios tean lo más seguros posible.

Gracies pola so comprensión y paciencia.

Greg Grossmeier, nel nome de los equipos d’Operaciones y Plataforma de la Fundación Wikimedia.

Diagrama temporal de la respuesta de Wikimedia

(Les hores tan en UTC)

7 d’abril:

8 d’abril:

9 d’abril:

10 d’abril:

Entrugues frecuentes

(Esta sección s’espanderá según se necesite).

  • ¿Por qué nun cambió la data «non válidu antes de» del certificáu SSL si yá lu trocaron?
    El nuesu fornidor de certificáu SSL caltién la data orixinal «non válidu antes de» (incorreutamente llamada dacuando data de «asoleyáu el») en cualquier certificáu trocáu. Esto nun ye una práctica estraña. Amás de mirar el cambiu de los ficheros .pem enllazaos más arriba na cronoloxía, la otra manera de comprobar que tuvo llugar el cambiu ye comparar la buelga dixital del certificáu nuevu cola del anterior.

‎中文

维基媒体基金会对于“心脏出血”(Heartbleed)安全漏洞的回复

在4月7日,保护互联网安全的基础程序之一OpenSSL被曝出有一个影响广泛的安全漏洞。维基媒体基金会已经修复了所有维基媒体站点的漏洞。如果您仅仅是维基百科(Wikipedia)的读者并且没有创建过账号,那么这一漏洞不会对您造成影响。如果您是维基媒体项目的注册用户,那么您需要在下次使用账户时重新登录。

这个问题被命名为心脏出血漏洞,它会允许网络攻击者从任何有此安全漏洞的网站上窃取被加密的信息。在此问题被揭露的数小时之后,维基媒体基金会下的维基站点都受到了该威胁的潜在影响。然而,我们没有证据表明我们的系统和用户信息受到影响;并且由于我们配置服务器的特殊方式,攻击者也难以通过此漏洞来窃取用户的维基账户密码。

自从发现了这个问题后,我们已经着手将系统软件升级至打上补丁后的版本。并且我们替换了关键的面向用户的SSL证书并且重设了所有用户会话令牌。详细时间表请参见下文。

所有登录的用户在向网站发送请求时,都发送了一个秘密的会话令牌。如果一个有恶意的人能够截获该令牌,他们就能以其他用户的名义操作。重置所有的用户会话令牌,可以确保所有用户在与我们服务器重新连接时,使用更新后的OpenSSL软件,从而阻止这一可能的攻击。

我们建议更换您的密码(这是一种标准的预防措施),但是我们目前不会强制所有用户更改密码。目前没有任何证据显示维基媒体基金会的用户遭受攻击,但是我们希望我们的所有用户能尽量安全。

感谢您的理解与耐心。

Greg Grossmeier,代表维基媒体基金会维护及平台团队。

维基媒体回应的时间表

(时间为UTC时间)

4月7日:

4月8日:

4月9日:

4月10日:

常见问题

(此章节在需要情况下会被扩充)

    • 在你们更换了SSL证书后,为什么你们的SSL证书的“生效日期”的具体时间没有更新?
      我们的SSL证书提供商不改变任何已更换过的证书的“生效日期”(有时被误称为“发行”日)。这不是一种罕见的做法。除了查看上文提到的对.pem文件的更改,另一种查看更改的方式是比对我们新旧证书的指纹。

‎Bahasa Indonesia

Tanggapan Wikimedia mengenai kerentanan keamanan “Heartbleed”

Logo untuk bug Heartbleed

Pada tanggal 7 April, isu menyebar mengenai komponen utama dari keamanan internet (OpenSSL) diberitahukan. Kerentanan ini sudah diperbaiki di seluruh wiki milik Wikimedia. Bila Anda hanya membaca Wikipedia tanpa membuat akun pengguna, tidak ada yang dibutuhkan dari Anda. Bila Anda membuat akun pengguna di seluruh wiki milik Wikimedia, Anda harus re-login disaat Anda selanjutnya menggunakan akun pengguna Anda.

Isu utamanya, disebut Heartbleed, memperbolehkan penyerang mendapatkan akses ke informasi khusus di semua situs yang lemah terhadap perangkat lunak tersebut. Wiki yang di host oleh Yayasan Wikimedia memiliki potensial untuk terpengaruh dengan kelemahan ini selama beberapa jam setelah kerentanan ini diberitahukan. Biarpun begitu, kami tidak memiliki bukti bahwa ada ada masalah kepada system kami untuk informasi pengguna, dan dikarenakan server kami memiliki konfigurasi khusus sendiri, akanlah sangat sulit untuk penyerang mengeksploit kerentanan ini untuk mengambil password akun pengguna.

Setelah kami mengetahui adanya isu ini, kami memulai memutakhirkan semua system kami dengan versi patch dari software yang dipermasalahkan. Kami selanjutnya mengganti sertifikat pengguna SSL yang kritikal dengan me-reset ulang semua sesi token. Lihat semua garis waktu respon dibawah.

Semua pengguna yang log-in menerima sesi token rahasia dengan setiap permintaan ke situs. Bila ada orang denga maksud buruk mampu menangkap token tersebut, mereka dapat berpura pura meniru pengguna lain. Me-reset ulang token untuk semua pengguna memiliki keuntungan membuat semua pengguna menghubungkan ulang ke server kami menggunakan perangkat lunak yang sudah dimutakhirkan dan sudah diperbaiki, alhasil menghapus kemungkinan penyerangan ini.

Kami merekomendasikan untuk mengganti password Anda untuk tindakan pencegahan, tetapi kami tidak bermaksud untuk memaksa penggantian password untuk semua pengguna. Sekali lagi, tidak ada bukti pengguna Yayasan Wikimedia menjadi sasaran untuk serangan ini, tetapi kami menginginkan semua pengguna untuk tetap dalam keadaan aman.

Terima kasih untuk kesabaran dan pengertiannya.

Greg Grossmeier, atas nama tim operasi WMF dan tim platform

Garis waktu respon Wikimedia

(waktu menggunakan UTC)

7 April:

8 April:

9 April:

10 April:

Pertanyan yang Sering Diajukan

(Bagian ini akan dikembangkan bila dibutuhkan.)

    • Mengapa “tidak valid sebelumnya” pada penanggalan sertifikat SSL diganti bila kamu (WMF) telah menggantinya?
      Penyedia sertifikat SSL kami tetap menggunakan tanggal “tidak valid sebelumnya” (terkadang salah menunjukan “tanggal” isu) di semua sertifikat yang telah diganti. Ini bukanlah praktek yang tidak umum. Selain dari melihat perubahan pada berkas .pem yang dihubungkan di garis waktu di atas, jalan lain untuk memverifikasi penggantian adalah membandingkan “sidik jari” pada sertifikat baru kami dengan sertifikat yang lama.

‎čeština

Jak reagovala Wikimedia na bezpečnostní riziko “Heartbleed”

Logo pro chybu Heartbleed

Dne 7. dubna byl odhalen rozsáhlý problém v centrální součásti internetové bezpečnosti (OpenSSL). Tato chyba zabezpečení je nyní na všech wikistránkách nadace Wikimedia opravena. Pokud chcete pouze číst Wikipedii bez vytvoření účtu, nemusíte dělat nic. Máte-li uživatelský účet na kterékoli wikistránce nadace Wikimedia, budete se před příštím použitím svého účtu muset znovu přihlásit.

Problém nazvaný Heartbleed by umožnil útočníkům získat přístup k privilegovaným informacím na libovolné stránce běžící ve zranitelné verzi tohoto softwaru. Wikistránky, jejichž hostitelem je nadace Wikimedia, byly potenciálně ovlivněny touto chybou zabezpečení po dobu několika hodin poté, co byla odhalena. Nicméně nemáme žádné důkazy o skutečném ohrožení našich systémů a informací o našich uživatelích, a protože naše servery jsou konfigurovány zvláštním způsobem, bylo by pro útočníka velmi obtížné tuto chybu zabezpečení zneužít ke zcizení uživatelských hesel.

Poté, co jsme o tomto problému byli uvědoměni, začali jsme do všech našich systémů instalovat opravenou verzi daného softwaru. Pak jsme začali nahrazovat ohrožené SSL certifikáty uživatelů a resetovat všechny uživatelské znaky pověření. Podívejte se na časový průběh naší reakce níže.

Všichni přihlášení uživatelé s každým požadavkem na web posílají tajný znak pověření pro relaci. Pokud by někdo byl schopný zachytit tento znak, mohl by se vydávat za jiného uživatele. Resetování znaků pro všechny uživatele má tu výhodu, že se všichni uživatelé musí znovu připojit k našim serverům prostřednictvím aktualizované a opravené verze softwaru OpenSSL, čímž se tento potenciální útok znemožní.

Doporučujeme změnu hesla jako standardní preventivní opatření, ale nemáme v současné době v úmyslu prosazovat změnu hesla pro všechny uživatele. Skutečně neexistuje žádný důkaz, že by se uživatelé stránek nadace Wikimedia stali terčem tohoto útoku, ale chceme jim zajistit co největší bezpečnost.

Děkujeme za vaše porozumění a vaši trpělivost.

Greg Grossmeier, jménem skupin WMF Operations and Platform

Časový průběh reakce Wikimedia

(Časy jsou v UTC)

7. dubna

8. dubna:

9. dubna:

10. dubna:

Často kladené otázky

(Tento oddíl se bude dle potřeby rozšiřovat.)

  • Proč se datum “neplatné před” u vašeho SSL certifikátu nezměnilo, i když jste již změnu provedli?
    Náš poskytovatel certifikátu SSL udržuje původní datum “neplatné před” (někdy nesprávně označované za datum “vydáno”) ve všech změněných certifikátech. To není neobvyklá praxe. Kromě možnosti podívat se na změny v souborech .PEM, na které se odkazuje v časovém průběhu výše, dalším způsobem ověření, že náhrada proběhla, je porovnat otisk našeho nového certifikátu s naším předchozím.

‎Magyar

A Wikimédia válasza a “Vérző szív” biztonsági sebezhetőségre

A Vérző szív hiba logója

Április 7-én egy széles körben elterjedt hibát fedeztek fel az OpenSSL internet-biztonsági program egy központi komponensében. A sebezhetőséget javítottuk az összes Wikimédia wikin. Ha a Wikipédiát felhasználói fiók nélkül használod, további teendőd nincs. Ha van felhasználói fiókod bármely Wikimédia wikin, újra be kell jelentkezned, amikor legközelebb használni akarod a fiókot.

A “Vérző szívnek” nevezett hiba kihasználásával a támadók bizalmas információkhoz férhetnek hozzá a szoftver sebezhető változatát futtató weboldalakon. A hiba felfedezését követően a Wikimédia Alapítvány által működtetett wikik néhány óráig ki voltak téve ennek a veszélynek. Ugyanakkor nem találtuk semmi jelét annak, hogy rendszereink vagy a felhasználóink adatai ilyen támadás áldozatává váltak volna, továbbá szervereink speciális konfigurációjának köszönhetően nagyon nehéz lett volna egy támadó számára a sérülékenység kihasználása és a felhasználók wikis jelszavainak megszerzése.

Amikor értesültünk a sebezhetőségről, az összes rendszerünket elkezdtük frissíteni a szóban forgó szoftver javított változatára. Ezután lecseréltük a kritikus, a felhasználók által látható SSL tanúsítványokat, és megszakítottuk az összes munkamenetet. Az események pontos lefolyását lásd lentebb.

Minden bejelentkezett felhasználó böngészője egy titkos tokent (azonosító kódot) küld az oldalnak minden lapletöltéskor. Ha egy rosszindulatú személy megszerezné ezt a tokent, el tudná hitetni az oldallal, hogy ő az adott felhasználó. A tokenek cseréje használhatatlanná teszi az esetlegesen már ellopott tokeneket, és rákényszeríti a felhasználókat, hogy újra bejelentkezzenek, az új, biztonságos OpenSSL szoftvert használva.

Nem áll szándékunkban, hogy minden felhasználót kényszerítsünk jelszavának megváltoztatására, mégis azt javasoljuk, hogy elővigyázatosságból változtasd meg a jelszavadat. Nincs jele annak, hogy a Wikimédia Alapítvány felhasználói ellen támadás irányult volna, de azt szeretnék, ha minden felhasználónk a legnagyobb biztonságban érezhetné magát.

Köszönjük a megértésedet és a türelmedet.

Greg Grossmeier, a WMF üzemeltetési és platform csapatának nevében

A Wikimédia ellenintézkedéseinek pontos menete

(Az időpontok UTC idő szerint értendők)

Április 7:

Április 8:

Április 9:


Április 10:

Gyakran Ismételt Kérdések

(Ez a szakasz még bővülhet.)

  • Ha lecseréltétek az SSL tanúsítványokat, miért nem változott a “not valid before” (legkorábbi érvényesség) dátumuk?
    Az SSL-tanúsítvány-szolgáltatónk megőrzi az eredeti “not valid before” dátumot (amit néha tévesen kibocsájtási dátumnak neveznek), ha lecserél egy tanúsítványt. Ez egy bevett szokás. Az eseménysorban lévő linken látható, hogy a .pem fájlok megváltoztak, és akkor is, ha a régi és az új ujjlenyomatot összehasonlítod.

‎Bahasa Melayu

Reaksi wikimedia kepada lemah keselamatan internet yang “Hati berdarah”

Logo for the Heartbleed bug

Pada 7 April, suatu isu luas bangkit bagi keselamatan Internet (OpenSSL) telah diketahui orang. Isu kelemahan ini semakin dibetulkan oleh semua Wikimedia wikis. Jika anda hanya membaca wikipedia tanpa akaun, tidak ada apa yang perlu daripada anda. Jika anda mempunyai akaun pengguna dalam Wikimedia wiki, anda perlu re-login lain kali apabila mengguna akaun anda.

Isu ini dipanggil Heartbleed, membolehkan penyerang internet mendapat akses ke informasi keutamaan dari program komputer yang berversi kurang kekuatan. Pengguna iaitu Wikis, yang ditaja oleh Wikimedia Foundation telah didapati berpotensi untuk dipengaruhi oleh kekurangan tersebut setelah ianya diketahui dalam beberapa jam lalu. Walaupun demikian, pihak kami masih tidak dapat apa bukti dari sistem kami ataupun pengguna kami, dan oleh sebab cara kami memperadukkan server pihak kami, ini telah menyusahkan penyerang-penyerang untuk mendapat seberang kata laluan pengguna wiki.

Setelah tersedar dengan isu tersebut, kami mulai menaik taraf sistem dengan versi program yang dibaiki dalam situasi tertanyaan. Kami bermula dengan pertukaran sijil SSL utama (tingkat pengguna) dan pertukaran semua sessi token. Lihat semua tali masa daripada tindakbalas kami seperti di bawah.

Semua daftar masuk pengguna mengirimkan sessi parameter rahsia yang dikehendaki oleh website tersebut. Jika penjahat dapat menghentikan sessi token itu, ia juga dapat menirukan diri sebagai pengguna lain. Pertukaran sessi token untuk semua pengguna mempunyai kebaikan bagi menghubungkan pengguna dengan memakai versi OpenSSL program yang telah naik taraf dan diperbaiki, tambahan menolakan potensi diserang.

Kami mencadangkan pertukaran kata laluan anda sebagai pencegahan ini, tetapi bukan sekaranglah yang melaksanakan pertukaran kata laluan untuk semua pengguna. Tambahan jua pihak kami tidak mempunyai bukti pengguna Wikimedia Foundation menjadi matlamat diserang, tetapi kami ingin para pengguna kami dalam situasi yang selamat.

Ribuan terima kasih.

Greg Grossmeier, wakil WMF Operation and Platform teams

Tali masa untuk Wikimedia reaksi

(Masa dalam UTC)

7, April

8, April

9, April

10, April

Soalan yang sering ditanya

(Sessi ini akan dipanjangkan masa seperti yang dikehendaki.)

  • Mengapakah ” tidak sah sebelum” tarikh di SSL sijil anda bertukar kalau anda sudah menukarkannya?
    Sijil SSL pemberi kami tidak menukar tarikh asal “tidak sah sebelum” (kadang salah merupakan “tarikh keluar”) di dalam sijil yang tertukar. Ini bukan suatu keadaan luar biasa. Selain itu, pertukaran di .pem fail yang dihubungi dengan tali masa atas, cara lain membuat verifikasi atas gantian ialah perbandingan antara baru dengan yang dulu.

‎Português do Brasil

Resposta da Wikimedia à vulnerabilidade de segurança “Heartbleed”

Logo do bug Heartbleed

Em 7 de abril, foi revelado um problema generalizado em um componente central da segurança na internet (OpenSSL). A vulnerabilidade já foi corrigida em todas as wikis da Wikimedia. Se você apenas lê a Wikipédia sem criar uma conta de usuário, não é necessário que você faça nada. Se você possui uma conta em qualquer uma das wikis, você deve logar-se novamente da próxima vez que acessá-la.

Esse problema, chamado de Heartbleed, permitiria que pessoas mal intencionadas ganhassem acesso à informações privilegiadas em qualquer site que possuísse uma versão vulnerável do programa. Wikis da Fundação Wikimedia estiveram potencialmente afetadas por essa vulnerabilidade por várias horas após a divulgação da falha. Entretanto, não tivemos evidências de nenhum comprometimento em nossos sistemas ou nas informações de nossos usuários, e devido à forma particular como nossos bancos de dados são configurados, seria bem difícil para um hacker explorar a vulnerabilidade com o intuito de adquirir as senhas de usuários das wikis.

Após ficarmos cientes da vulnerabilidade, começamos a atualizar nossos sistemas com versões corrigidas do software em questão. Em seguida, começamos a substituir os certificados SSL críticos e resetamos todos os tokens de sessão dos usuários. Veja o cronograma completo da nossa resposta abaixo.

Todos os usuários logados enviam um token secreto a cada pedido ao site. Se uma pessoa mal-intencionada fosse capaz de intercetar este token, poderia fazer-se passar por esse usuário. Ao resetar todos os tokens, forçamos os usuários a se logarem aos nossos servidores utilizando a versão atualizada e corrigida do software OpenSSL, eliminando assim a possibilidade deste ataque.

Nós recomendamos que você altere sua senha como um ato de precaução, todavia não é nossa pretensão que essa mudança de senha seja para todos os usuários. Novamente, não houve evidência de que os usuários Wikimedia Foundation tenha sido atingidos por esse ataque, mas desejamos que todos os nossos usuários estejam tão seguros quanto possível.

Agradecemos pela sua compreensão e paciência.

Greg Grossmeier, no interesse da equipe de Operações e Plataformas da WMF

Linha do tempo da reposta da Wikimedia

(Os horários estão em UTC)

7 de abril:

8 de abril:

9 de Abril:

10 de Abril:

Perguntas mais frequentes

(Esta secção será expandida conforme necessidade)

  • Porque é que a data “não válido antes de” não mudou no certificado SSL se vocês já o substituíram?
    O nosso provedor de certificados SSL conserva a data original “não válido antes de” (algumas vezes chamado, ainda que incorretamente, de “publicado em”) em qualquer certificado substituído. Não se trata de uma prática incomum. Para além de observar as mudanças nos arquivos .pem linkados acima na linha do tempo, pode-se também verificar que a substituição foi efetivada comparando a assinatura digital (fingerprint) do novo certificado com a do anterior.

‎ Português

Resposta da Wikimedia à vulnerabilidade de segurança “Heartbleed”

Logótipo do erro Heartbleed

A 7 de abril, revelou-se um problema generalizado num componente central da segurança da Internet (OpenSSL). A vulnerabilidade já foi corrigida em todas as wikis da Wikimedia. Se só lê a Wikipédia sem se ter registado, nada lhe é pedido. Se tem uma conta de utilizador numa wiki Wikimedia, vai ser necessário que volte a iniciar sessão na próxima vez que a utilizar.

O problema, denominado Heartbleed, permitia que utilizadores mal intencionados tivessem acesso a dados sensíveis em qualquer site que estivesse a utilizar uma versão vulnerável deste software. As Wikis hospedadas pela Wikimedia estiveram vulneráveis durante várias horas depois da comunicação desta falha. Contudo, não temos qualquer prova de que os nossos sistemas ou os dados dos nossos utilizadores tenham sido afetados, e dada a forma como os nossos servidores estão configurados, teria sido muito difícil explorar esta vulnerabilidade de modo a roubar passwords dos nossos utilizadores.

Depois de termos tido conhecimento desta vulnerabilidade, começámos a atualizar todos os nossos sistemas com versões corrigidas. A seguir, começámos a substituir os certificados SSL críticos e redefinimos todos os tokens de sessão de utilizador. Veja o cronograma da nossa resposta abaixo.

Todos os utilizadores com sessão iniciada enviam um token secreto a cada pedido ao site. Se uma pessoa mal-intencionada fosse capaz de intercetar este token, poderia fazer-se passar por esse utilizador. Ao repor todos os tokens, forçamos os utilizadores a ligarem-se aos nossos servidores utilizando a versão atualizada e corrigida do firmware OpenSSL, eliminando assim a possibilidade deste ataque.

Recomendamos que altere a sua password como forma de precaução padrão, mas não tencionamos impor esta mudança a todos os utilizadores. Novamente, não há nenhum indício que indique que os utilizadores da Fundação Wikimedia tenham sido atacados, mas queremos que todos os nossos utilizadores estejam tão seguros quanto possível.

Obrigado pela sua compreensão e paciência.

Greg Grossmeier, em nome das equipas da WMF Operations and Platform

Cronologia da resposta da Wikimedia

(Os horários estão em UTC)

7 de abril:

8 de abril:

9 de Abril:

10 de Abril:

Perguntas mais frequentes

(Esta secção será expandida caso seja necessário)

  • Porque é que a data “não válido antes de” não mudou no certificado SSL se já o substituíram?
    O nosso provedor de certificados SSL conserva a data original “não válido antes de” (algumas vezes chamado, ainda que incorretamente, “publicado em”) em qualquer certificado substituído. Não se trata de una prática incomum. Para além de observar as mudanças nos ficheiros .pem que têm um link acima, na cronologia, pode-se também verificar que a substituição foi levada a cabo comparando a assinatura digital (fingerprint) do novo certificado com a do anterior.

‎русский

Обращение Викимедиа по поводу уязвимости безопасности из-за ошибки “heartbleed”

Логотип ошибки “Heartbleed”

7го апреля была раскрыта широко распространенная ошибка в центральном компоненте интернет-безопасности (OpenSSL). Уязвимость уже исправлена на всех проектах Викимедиа. Если вы только читаете википедию и не имеете аккаунта, от вас ничего не требуется. Если же вы имеете аккаунт на любом проекте Викимедиа, вы должны перелогиниться в следующий раз, когда будете использовать свой аккаунт.

Ошибка, получившая название “Heartbleed”, позволяла злоумышленникам получать доступ к конфиденциальной информации на любом сайте, использующем уязвимую версию этого программного обеспечения. Проекты Викимедиа потенциально были уязвимы в течение нескольких часов после того, как ошибка была обнародована. Однако у нас нет никаких свидетельств того, что наши системы или данные пользователей пострадали, а благодаря конфигурации наших серверов использовать эту ошибку для кражи паролей было бы очень сложно.

После того, как мы узнали об ошибке, мы начали обновлять все системы исправленной версией программного обеспечения. Потом мы начали заменять критические SSL-сертификаты, контактирующие с пользователями, и завершили все сессии пользователей. Смотрите полную хронологию наших действий ниже.

Все залогиненые пользователи присылают секретный сессионный токен на каждое обращение к сайту. Если нечестный человек перехватит этот токен, он сможет представляться другими пользователями. Сброс этих токенов для всех пользователей хорош тем, что заставляет всех пользователей переподключаться к нашим серверам используя обновленную и исправленную версию программного обеспечения OpenSSL, что исключает потенциальные атаки.

Мы рекомендуем изменение пароля как стандартную профилактическую меру, но мы, к настоящему времени, не собираемся заставлять это делать всех пользователей. Повторяем, не было никаких доказательств, что пользователи Викимедиа были атакованы, но мы хотим, чтобы все пользователи были настолько защищены, насколько это возможно.

Спасибо за ваше понимание и терпение.

Грег Гроссмейер, от имени команд операций и платформ Викимедии.

Хронология действий Викимедиа

(Время указано в UTC)

7ое апреля:

8ое апреля:

9ое апреля:

10ое апреля:

Часто Задаваемые Вопросы

(Этот раздел будут увеличиваться по мере необходимости.)

  • Почему “не действительны до” даты на вашем сертификате SSL не изменились, если вы уже заменили его?
    Наш поставщик SSL-сертификатов сохраняет исходную дату “не действителен до” (которую иногда ошибочно называют датой выдачи) на всех заменённых сертификатах. Это вполне распространённая практика. Помимо изучения изменений в .pem-файлах, ссылки на которые приведены выше в хронологии, убедиться в том, что замена действительно произошла, можно, сравнив отпечатки (fingerprints) нового и старого сертификатов.

தமிழ்

“இதயகசிவு”-எனும் பாதுகாப்பு பாதிப்பிற்கு விக்கிமீடியாவின் எதிர்ச்செயல்

இதயகசிவு இலச்சினை

ஏப்ரல் 7-ம் திகதி ஓப்பன் எஸ்.எஸ்.எல் என்னும் இணைய கருவியில் ஏற்பட்ட பாதுகாப்பு பாதிப்பு, தற்போது விக்கிமீடியாவின் விக்கிகளில் சரிசெய்யப்பட்டுள்ளது. நீங்கள் விக்கிமீடியாவின் கணக்குத் துவங்கியிருந்தால் உங்களுடைய கணக்கில் இருந்து ஒரு முறை வெளியேறி பின் புகுபதிகை செய்யவும். உங்களுக்கான பயனர் கணக்கு இல்லையெனில் எதுவும் செய்யத் தேவையில்லை.

Heartbleed | இதயகசிவு என்னும் இப்பிணக்கு கணக்கு விவரங்களை தவறாக பயன்படுத்த வழிவகை செய்கிறது. விக்கிமீடியாவின் விக்கிகள் இப்பிணக்கிற்கு பல மணி நேரம் ஆளாகி உள்ளனவா என்று தெரியவில்லை. விக்கி திட்டங்களில் கடவுச்சொல் பாதுகாப்பான முறையில் இருந்த போதும், இது நடந்துள்ளது.

இப்பிணக்கை அறிந்தவுடன் விக்கிமீடியா திட்டங்களில் அதற்கான பாதுகாப்பு நடவடிக்கை எடுக்கப்பட்டுள்ளது. தற்போது இணையதளத்திற்கான எஸ். எஸ். எல் மற்றும் பயனர் தகவல் சேர்ப்பான் மாற்றப்பட்டு வருகிறது. முழு நடவடிக்கைக்கான காலக்கோடு கீழே கொடுக்கப்பட்டுள்ளது.

ஒவ்வொரு முறை நீங்கள் ஒரு பக்கத்தினை திறக்கும்போது, அனைத்து புகுபதிகை செய்யப்பட்ட பயனர்களும் தங்களுக்குறிய தகவல் சேர்ப்பானும் அனுப்பப்படும். யாராவது புல்லுருவிகள் இந்த தகவல் சேர்ப்பானை இடைமறித்து தகவல்களை பெற இந்த இதயகசிவு வழு வழிவகை செய்கிறது. இதை தவிர்க்க அனைத்து பயனர்களின் தகவல்களையும் ஒரே நேரத்தில் மாற்ற விக்கிமீடியா நிறுவனம் முடிவு செய்துள்ளது.

முன்கட்ட நடவடிக்கையாக கடவுச்சொற்களை மாற்ற விக்கி பரிந்துரை செய்கிறது. விக்கித்திட்டங்களில் இதனுடைய பாதிப்பு இருப்பதாகத் தெரியவில்லை ஆயினும் விக்கிமீடியா நிறுவனம் பாதுகாப்புகளை அதகரிக்க முடிவெடுத்துள்ளது.

தங்களின் புரிந்துணர்வுக்கும் அமைதிக்கும் நன்றி.

க்ரெக் க்ராஸ்மெயர், விக்கிமீடியா நிறுவனம் சார்ப்பாக

விக்கிமீடியாவின் எதிர்ச்செயல் காலக்கோடு

கால நேரம் (ஒ.ச.நே.)

ஏப்ரல் 7:

ஏப்ரல் 8:

ஏப்ரல் 9:

ஏப்ரல் 10:

அடிக்கடி கேட்கப்படும் கேள்விகள்

(இந்தப் பிரிவு தேவைக்கேற்ப விரிவுபடுத்தப்படும்.)

  • ஏற்கனவே எஸ். எஸ். எல் மாற்றப்பட்ட போதும் ஏன் “not valid before” தேதி ஏன் மாற்றப்படவில்லை?
    நமக்கு எஸ். எஸ். எல் வழங்குநர்கள் சில நேரங்களில் “not valid before” திகதி புதிய எஸ். எஸ். எல்-இல் மாற்றப்படவில்லை. இரண்டு எஸ். எஸ். எல்-ற்கும் வித்தியாசத்தை உணர .pem கோப்புகளை பார்க்கலாம் அல்லது இரண்டு எஸ். எஸ். எல் ரேகையையும் ஒப்பிட்டு பார்த்தால் மாற்றம் தெரியவரும்.

‎한국어

위키미디어의 “Heartbleed” 보안 취약성 대응

4월 7일 인터넷 보안의 핵심 요소(OpenSSL)에서 문제가 발견되었습니다. 그 취약성은 이제 모든 위키미디어 위키에서 해결되었습니다. 위키백과 계정을 만들지 않고 단순히 읽기만 한다면 따로 할 일이 없습니다. 어떠한 위키미디어 위키에라도 계정이 있다면 다음에 그 계정을 사용할 때 다시 로그인해야 합니다.

Heartbleed라고 불리는 이 버그는 공격자가 취약성이 있는 그 소프트웨어를 실행하는 사이트에 있는 비공개 정보를 취득할 수 있게 합니다. 위키미디어 재단에서 호스팅하는 위키에서는 이 취약성이 발견되고 몇 시간 동안 일시적으로 공격을 받을 수 있었습니다. 그러나 실제로 재단의 시스템이나 사용자 정보를 공격한 어떠한 증거도 없습니다. 재단 서버가 특별하게 구성되었기 때문에 공격자가 사용자의 위키 비밀번호를 취득하기 위해 그 취약성을 악용하기는 매우 힘들었을 것입니다.

재단에서 이 문제를 발견하고 문제가 있는 소프트웨어를 패치된 버전으로 재단의 모든 시스템을 업그레이드하기 시작했습니다. 그러고나서 중요한 사용자측 SSL 인증서를 대체하고 모든 사용자 세션 토큰을 초기화하기 시작했습니다. 아래에서 재단이 어떻게 대응했는지 그 경과를 보십시오.

모든 로그인한 사용자는 사이트에 요구할 때마다 비밀 세션 토큰을 전송합니다. 악의적인 사람이 그 토큰을 가로챌 수 있다면 다른 사용자를 사칭할 수 있을 것입니다. 모든 사용자의 토큰을 초기화하면 모든 사용자가 갱신되어 수정된 OpenSSL 소프트웨어를 사용해 다시 연결하게 하는 이점이 있으며 이는 잠재적인 공격을 방지합니다.

재단에서는 일반적인 사전 예방 조치로 사용자 비밀번호를 변경할 것을 권하지만 모든 사용자에게 강제로 비밀번호를 변경하게할 의도는 없습니다. 다시 한번 말하지만 위키미디어 재단 사용자가 이 공격의 대상이 되었다는 증거는 없지만 재단에서는 모든 사용자가 되도록 안전하기를 바랍니다.

사용자 여러분이 양해하고 참아주셔서 감사합니다.

WMF 운영 및 플랫폼 팀 Greg Grossmeier

위키미디어 대응 경과

(시간은 UTC)

4월 7일

4월 8일

4월 9일

4월 10일

자주 묻는 질문

(필요하면 추가됩니다.)

  • 사용자 SSL 인증서를 대체했다면 그 인증서의 “이전 무효” 날짜가 바뀌지 않은 이유는?
    재단의 SSL 인증서 제공자는 원래의 “이전 무효” 날짜(“발행” 날짜로 잘못 지칭되기도 함)를 대체 인증서에서도 유지합니다. 이는 특별한 관행이 아닙니다. 위의 경과에서 링크된 .pem 파일의 변화를 보는 외에 대체가 되었음을 확인하는 나머지 방법은 새 인증서의 식별자를 이전의 식별자와 비교하는 것입니다.

‎日本語

Heartbleed 問題へのウィキメディアの対応

Heartbleed bugのロゴマーク

4月7日、インターネット上のセキュリティの中心的要素であるOpenSSLに、広範な影響をもたらす問題があることが分りました。この脆弱性は全ウィキメディアのウィキで修正済みです。アカウントを作らずにウィキペディアを読んでいるだけの方は、なにも対処する必要はありません。ウィキメディアのウィキでアカウントを持っている方は、次回アカウントを使う際に再ログインが必要です。

このOpenSSLの特定のバージョンには、制限された情報が読み取られる脆弱性があり、それを使っているどんなウェブサイトでも攻撃が可能となってしまっています。この問題は、Heartbeat(心拍の意)という機能のバグから来ているため、Heartbleed(心臓出血の意)と呼ばれます。ウィキメディア財団の運営するウィキでは、この問題が公にされてから数時間のあいだ、この脆弱性による攻撃が可能な状態にありました。現在のところ、システムあるいは利用者の個人情報が実際に漏洩した形跡は確認されていません。また、財団のサーバーの設定および構成上、この脆弱性を利用して、利用者のウィキ上のパスワードを集めることは困難であるものと考えられます。

財団では、この問題の認識後すぐ、システム上で稼働している問題のソフトウェアを修正しました。その後、利用者との通信に使われていたSSL証明書を変更し、すべての利用者のログイン状態を解除しました。詳細な対応の時系列は後述します。

ログインしているすべての利用者は、サイトに接続するとき、暗号化されたセッション・トークンというものを送ります。Heartbleedという問題を悪用して、そのセッション・トークンが傍受されてしまうと、その利用者になりすますことが可能になってしまいます。ログイン状態を解除すると、セッション・トークンは再設定されることになり、問題の修正されたソフトウェアによって通信することが保証され、リスクを取り除くことができます。

用心のため、パスワードの変更をお勧めいたしますが、全利用者のパスワードを強制的に変更する予定はありません。ウィキメディア財団の利用者がこの脆弱性による攻撃に晒された形跡はありませんが、財団では、利用者のみなさまの安全のために最善を尽くす所存です。

ご理解に感謝いたします。

Greg Grossmeier、ウィキメディア財団コンピューターシステム運営ティーム

ウィキメディア財団による対応の時系列

(時刻はすべてUTCです)

4月7日:

4月8日:

4月9日:

4月10日:

よくある質問

(必要に応じて加筆されます)

  • SSL証明書を変えたのに、なぜ「有効日」は変わっていないのですか?
    財団で利用しているSSL証明書発行所では、証明書を交換した場合でも「有効日」はそのままになっています（よく誤解されますが、これは「発行日」ではありません）。これは特に珍しいことではありません。上記時系列に示した.pemファイルの変更のほかに、証明書の更新が行われたか確認する方法としては、更新の前と後の証明書のフィンガープリントを比較する方法があります。

by Greg Grossmeier at April 10, 2014 04:34 PM

Sage Ross

Remembering Adrianne Wadewitz

Adrianne, skepchickal

I remember, for a long time before I met her, wondering what “a wade wit” meant.

I remember a Skype conversation, years ago. Adrianne, Phoebe, SJ and I talked for probably three hours about the gender gap on Wikipedia, late into the night. Then and always, she was relentlessly thoughtful and incredibly sharp. As superb as she was in writing, she was even better in live conversation and debate.

I remember laughing and talking and laughing and talking at Wikimania 2012. I took this picture of her that she used for a long while as a profile pic. Someone on Facebook said it looked “skepchickal”, which she loved.

I remember her unfailing kindness and generosity, indomitable work ethic, and voracious appetite for knowledge. She made me proud to call myself a fellow Wikipedian.

by Sage Ross at April 10, 2014 02:40 PM

Magnus Manske

Post scriptum

I am running a lot of tools on Labs. As with most software, the majority of feedback I get for those tools falls into one of two categories: bug reports and feature requests, the latter often in the form “can the tool get input from/filter on/output to…”. In many cases, that is quick to implement; others are more tricky. Besides increasing the complexity of tools, and filling up the interface with rarely-used buttons and input fields, the combinations (“…as you did in that other tool…”) would eventually exceed my coding bandwidth. And with “eventually”, I mean some time ago.

Wouldn’t it be better if users could “connect” tools on their own? Take the output of tool X and use it as the input of tool Y? About two years ago, I tried to let users pipeline some tools on their own; the uptake, however, was rather underwhelming, which might have been due to the early stage of this “meta-tool”, and its somewhat limited flexibility.

A script and its output.

So today, I present a new approach to the issue: scripting! Using toolscript, users can now take results from other tools such as category intersection and Wikidata Query, filter and combine the results, and display the results or even use tools like WiDaR to perform on-wiki actions. Many of these actions come “packaged” with this new tool, and the user has almost unlimited flexibility in operating on the data. This flexibility, however, is bought by the scary word programming (a euphemism for “scripting”). In essence, the tool runs JavaScript code that the user types or pastes into a text box.

Still here? Good! Because, first, there are some examples you can copy, run, and play with; if people can learn MediaWiki markup this way, JavaScript should pose little challenge. Second, I am working on a built-in script storage, which should add many more example scripts, ready to run (in the meantime, I recommend a wiki or pastebin). Third, all built-in functions use synchronous data access (no callbacks!), which makes JavaScript a lot more … scriptable, as in “logical linear flow”.

The basic approach is to generate one or more page lists (on a single Wikimedia project), and then operate on those. One can merge lists, filter them, “flip” from Wikipedia to associated Wikidata items and back, etc. Consider this script, which I wrote for my dutiful beta tester Gerard:

all_items = ts.getNewList('','wikidata');                        // empty list that will collect Wikidata items
cat = ts.getNewList('it','wikipedia').addPage('Category:Morti nel 2014') ; // the Italian "died in 2014" category
cat_item = cat.getWikidataItems().loadWikidataInfo();            // its Wikidata item, with sitelinks loaded
$.each ( cat_item.pages[0].wd.sitelinks , function ( site , sitelink ) {   // every linked page on other projects
  var s = ts.getNewList(site).addPage(sitelink.title);
  if ( s.pages[0].page_namespace != 14 ) return ;                // skip sitelinks that are not categories
  var tree = ts.categorytree({language:s.language,project:s.project,root:s.pages[0].page_title,redirects:'none'}) ;
  var items = tree.getWikidataItems().hasProperty("P570",false); // items in that tree without a date of death (P570)
  all_items = all_items.join(items);                             // merge into the collection
} )
all_items.show();                                                // display the combined list

This short script will display a list of all Wikidata items that are in a “died 2014” category tree on any Wikipedia but do not yet have a death date. The steps are as follows:

  • Takes the “Category:Morti nel 2014″ from it.wikipedia
  • Finds the associated Wikidata item
  • Gets the item data for that item
  • For all of the site links into different projects on this item:
    • Checks if the link is a category
    • Gets the pages in the category tree for that category, on that site
    • Gets the associated Wikidata items for those pages
    • Removes those items that already have a death date
    • Adds the ones without a death date to a “collection list”
  • Finally, displays that list of Wikidata items with missing death dates

Thus, with a handful of straightforward functions (like “get Wikidata items for these pages”), one can ask complex questions of Wikimedia sites. A slight modification could, for example, create Wikidata items for the pages in these categories. All functions are documented in the tool. Many more can be added on request; and, as with adding Wikidata labels, a single added function can enable many more use-cases.
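
As a flavour of such variations, here is a minimal sketch that reuses only the helper calls from the script above; the category name and the property P18 (Wikidata’s “image” property) are illustrative choices of mine, not taken from the post:

// Illustrative sketch only: list Wikidata items in the English "2014 deaths"
// category tree that lack an image (P18). Assumes a bare category name is
// accepted as the tree root, mirroring the script above.
var tree = ts.categorytree ( { language:'en', project:'wikipedia', root:'2014 deaths', redirects:'none' } ) ;
var no_image = tree.getWikidataItems().hasProperty ( "P18", false ) ;
no_image.show() ;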

I hope that this tool can become a hub for users who want more than the “simple” tools, to answer complex questions, or automate tedious actions.

by Magnus at April 10, 2014 01:44 PM

Wikimedia Tech Blog

MediaWiki localization file format changed from PHP to JSON

Translations of MediaWiki’s user interface are now stored in a new file format—JSON. This change won’t have a direct effect on readers and editors of Wikimedia projects, but it makes MediaWiki more robust and open to change and reuse.

MediaWiki is one of the most internationalized open source projects. MediaWiki localization includes translating over 3,000 messages (interface strings) for MediaWiki core and an additional 20,000 messages for MediaWiki extensions and related mobile applications.

User interface messages, originally written in English, and their translations have historically been stored in PHP files along with the MediaWiki code. New messages and documentation were added in English, and these messages were translated on translatewiki.net into over 300 languages. These translations were then pulled in by MediaWiki websites using LocalisationUpdate, an extension MediaWiki sites use to receive translation updates.

So why change the file format?

The motivation to change the file format was driven by the need to provide more security, reduce localization file sizes and support interoperability.

Security: PHP files are executable code, so the risk of malicious code being injected is significant. In contrast, JSON files are only data which minimizes this risk.

Reducing file size: Some of the larger extensions have had multi-megabyte data files. Editing those files was becoming a management nightmare for developers, so they were split into one file per language instead of storing all languages in a single large file.

Interoperability: The new format increases interoperability by allowing features like VisualEditor and Universal Language Selector to be decoupled from MediaWiki, because the JSON format can be used without MediaWiki. This was earlier demonstrated for the jquery.i18n library. This library, developed by Wikimedia’s Language Engineering team in 2012, had internationalization features that are very similar to what MediaWiki offers, but it was written fully in JavaScript, and stored messages and message translations using the JSON format. With LocalisationUpdate’s modernization, MediaWiki localization files are now compatible with those used by jquery.i18n.
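
For illustration, here is a minimal sketch of what such a per-language message file looks like; the extension name and message keys below are hypothetical, while the overall shape (one file per language such as en.json or de.json, plus a qqq.json holding message documentation, each with an "@metadata" block) follows the documented format:

{
    "@metadata": {
        "authors": [ "Example Author" ]
    },
    "myextension-desc": "Adds an example feature to the wiki",
    "myextension-save-label": "Save example"
}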

An RFC on this topic was compiled and accepted by the developer community. In late 2013, developers from the Language Engineering and VisualEditor teams at Wikimedia collaborated to figure out how MediaWiki could best process messages from JSON files. They wrote a script for converting PHP to JSON, made sure that MediaWiki’s localization cache worked with JSON, and updated the LocalisationUpdate extension for JSON support.

Siebrand Mazeland converted all the extensions to the new format. This project was completed in early April 2014, when MediaWiki core switched over to processing JSON, creating the largest MediaWiki patch ever in terms of lines of code. The localization formats are documented in mediawiki.org, and MediaWiki’s general localization guidelines have been updated as well.

As a side effect, code analyzers like Ohloh no longer report skewed numbers for lines of PHP code, making metrics like comment ratio comparable with other projects.

Work is in progress on migrating other localized strings, such as namespace names and MediaWiki magic words. These will be addressed in a future RFC.

This migration project exemplifies collaboration at its best, with many MediaWiki engineers contributing to the effort. I would like to especially mention Adam Wight, Antoine Musso, David Chan, Ed Sanders, Federico Leva, James Forrester, Jon Robson, Kartik Mistry, Niklas Laxström, Raimond Spekking, Roan Kattouw, Rob Moen, Sam Reed, Santhosh Thottingal, Siebrand Mazeland and Timo Tijhof.

Amir Aharoni, Interim PO and Software Engineer, Wikimedia Language Engineering Team

by Amir E. Aharoni at April 10, 2014 11:55 AM

Not Confusing (Max Klein)

The Topmost Cited DOIs on Wikipedia

You’re surfing a topic of great interest to you on Wikipedia, so interesting that you actually click through to the references. You’re excited to read the original material, but all of a sudden you are foiled—you’ve hit a paywall! And $35 to read an article is just too steep.

This image of Xanthichthys ringens is sourced from an open-access scholarly article licensed for re-use. How can we make that reusability explicit when citing this source in Wikipedia articles? For further details, see this Signpost op-ed by Daniel Mietchen.

The Wikipedia Open Access Signalling Project, which I’ve recently joined, sees this as a fantastic opportunity to spread the word about the Open Access (OA) movement. We are still in the initial stages of understanding what OA materials are currently cited on-wiki. One of the ways OA has been cited on Wikipedia so far is through the Template:Cite DOI. A Digital Object Identifier (DOI) can be thought of as an ISBN for articles – academic or otherwise – in the networked world.

The DOI becomes useful in citations because, as with any identifier, one can unmistakably and machine-readably know what is being cited. For the OA Signalling Project, machine-readability is key: it will allow us to tell readers whether a cited resource is OA or free-to-read before they even click on it. By analysing the usage of DOIs on Wikipedia we can see which areas are starting to catch on to this slick method of citation.

Our method is based on processing the English Wikipedia database dumps from March 2014. A hat tip goes to the wonderful Python tools written by Wikimedia Foundation Analytics member Aaron Halfaker. Note that this analysis looks strictly for the Cite doi template, so it does not take into account DOIs that are mentioned in the Cite journal template.
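
To make the extraction step concrete, here is a rough sketch of the idea; it is not the project’s actual pipeline (which uses the Python tooling mentioned above), just an illustration of pulling DOIs out of {{Cite doi|...}} templates in raw wikitext and tallying their registrant prefixes:

// Illustrative sketch only: count DOI registrant prefixes in {{Cite doi|...}} templates.
var citeDoi = /\{\{\s*cite\s+doi\s*\|\s*(10\.[^|}\s]+)/gi;
function doiPrefixCounts ( pages ) {
  var counts = {}, m;
  pages.forEach ( function ( text ) {
    while ( ( m = citeDoi.exec ( text ) ) !== null ) {
      var prefix = m[1].split('/')[0];              // e.g. "10.1016" for Elsevier
      counts[prefix] = ( counts[prefix] || 0 ) + 1;
    }
  } );
  return counts;
}
// Toy usage:
console.log ( doiPrefixCounts ( [ 'Some claim.<ref>{{cite doi|10.1038/nature06949}}</ref>' ] ) );
// → { '10.1038': 1 }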

Let’s start off with a surface-level inquiry: which DOIs are cited by the largest number of Wikipedia pages? Below are the top 10.

DOI | Article Title | Times Cited
10.1128/MCB.22.19.6663-6668.2002 | Purified Box C/D snoRNPs Are Able To Reproduce Site-Specific 2′-O-Methylation of Target RNA In Vitro | 171
10.1093/icb/icr006 | Evolution and Ecology of Directed Aerial Descent in Arboreal Ants | 132
10.1093/nar/gkj002 | A novel experimental approach for systematic identification of box H/ACA snoRNAs from eukaryotes | 75
10.1128/MCB.24.13.5797-5807.2004 | Human Box H/ACA Pseudouridylation Guide RNA Machinery | 66
10.1088/0004-6256/141/5/170 | Rotational properties of Jupiter Trojans I. Light curves of 80 objects. | 64
10.1093/emboj/20.11.2943 | Cajal body‐specific small nuclear RNAs: a novel class of 2′‐O‐methylation and pseudouridylation guide RNAs | 50
10.1088/0004-637X/753/2/156 | Parallaxes and proper motions of ultracool brown dwarfs of spectral types Y and late T. | 38
10.1088/0067-0049/197/2/19 | The first hundred brown dwarfs discovered by the wide-field infrared survey explorer. | 33
10.3897/zookeys.242.3856 | A new species of Schrankia Hübner, 1825 from China (Lepidoptera, Erebidae, Hypenodinae) | 28
10.1073/pnas.242603899 | Generation and initial analysis of more than 15,000 full-length human and mouse cDNA sequences | 27

From an individual-article point of view, we can see that biology performs very well, with many articles about molecular and cell biology being widely cited. In 2nd and 9th place, studies of ants and moths get their due, and astronomy also finds success with three positions in the top 10. However, if one looks at the articles in which these citations occur, they appear to be mostly stub pages about RNA variations or outlying stars. We know that not every citation is equal, because citations on pages that receive more views gain more exposure. So we adapt our method to sum the page views of each page on which the DOI appears. To do this we used Wiki View Stats, which is quickly becoming a valid alternative to the strained stats.grok.se. This top 10 by viewership tells a very different story.

DOI | Article Title | Views in March ’14
10.1038/460787a | Climate data spat intensifies | 2,767,151
10.1038/462545a | Climatologist under pressure | 2,766,979
10.1353/cwh.1969.0065 | Twelve Years a Slave (review) | 1,382,286
10.1145/1284621.1284635 | Why you can’t cite Wikipedia in my class | 1,313,264
10.1371/journal.pone.0028705 | Nedoceratops’: An Example of a Transitional Morphology | 1,216,279
10.1126/science.1173983 | Recent Warming Reverses Long-Term Arctic Cooling | 704,326
10.1038/nature06949 | High-resolution carbon dioxide concentration record 650,000-800,000 years before present | 646,251
10.1073/pnas.0805721105 | Proxy-based reconstructions of hemispheric and global surface temperature variations over the past two millennia | 646,251
10.1080/10807030802387556 | A Comparative Analysis of Accident Risks in Fossil, Hydro, and Nuclear Energy Chains | 624,108
10.1098/rsbm.1955.0005 | Albert Einstein. 1879-1955 | 544,353

Firstly, some of these citations reach a large audience: the topmost received 2.75 million views in March 2014. True, the cited article is not being read directly, but some of its content is transmitted with each read of the page. The overall feel of these citations is also different: somehow they are less rigorously academic and cover more controversial topics. Clearly climate change features heavily, which is a testament both to the public wanting to read about climate change and to the desire of Wikipedians to cite those pages thoroughly. This new perspective is essentially a new altmetric.

While we are investigating individual DOIs, we can use the fact that a DOI has a prefix-suffix composition. The prefix indicates the “registrant” (usually a publisher), and the suffix identifies the specific item from that publisher. What are the most used DOI prefixes? Below are the top 10.

Prefix | Name | DOIs in Use
10.1016 | Elsevier | 3398
10.1038 | Nature Publishing Group | 1879
10.1007 | Springer-Verlag | 1793
10.1098 | The Royal Society | 1716
10.1111 | Wiley Blackwell (Blackwell Publishing) | 1350
10.1093 | Oxford University Press | 1203
10.1002 | Wiley Blackwell (John Wiley & Sons) | 960
10.1021 | American Chemical Society | 873
10.1126 | American Association for the Advancement of Science (AAAS) | 821
10.1080 | Informa UK (Taylor & Francis) | 704

With Elsevier dominating the top spot, there is clearly still a lot of room for growth for OA citations on Wikipedia. There are also other registrants here which do not support OA policies. If you know more about the OA-ness of each of these top 10, we would like to hear from you. Anecdotally, if we look at journal citations that do not use DOIs but use Template:Cite Journal instead, this does not appear to be an unusual distribution, as the top cited journals extracted from Template:Cite Journal overlap heavily with this list.

Some of this analysis could already be performed by “Cite-o-meter”, an existing service developed in 2011 that answers queries at http://toolserver.org/~dartar/cite-o-meter/. Cite-o-meter has not seen development in a few years, yet it actually allows us to think bigger about these statistics. It would be a worthwhile undertaking to merge into, or add to, cite-o-meter the altmetric of citation strength by Wikipedia pageviews. Such a service could also include publisher strength by Wikipedia pageviews, as well as researcher strength by using DOI-associated ORCID iDs.

Lastly, we can flip our previous question and ask: which pages cite the most DOIs?

Page Name | Times Cited
Induced stem cells | 189
List of Ig Nobel Prize winners | 91
Asymmetric hydrogenation | 86
Spinal muscular atrophy | 84
Crystallographic defects in diamond | 80
Fluorine | 78
Fullerene chemistry | 54
Choosing Wisely | 50
Woolly mammoth | 47
Health effects of tobacco | 43

The list is, as you might expect, power-law distributed; in fact, 60.6% of pages with any DOI citation have only one. Among these top citing pages we see a large number of science articles, some of which are quite obscure and recondite. One article that is promising for the OA signalling cause, however, is “Health effects of tobacco”, because it appears on the WikiProject Medicine top-importance list. And one of the ways the OA signalling project plans to prove its worth is by focusing foremost on those medical articles that the web reads for truly important information.

Our next task is to start writing a bot that can automatically determine the OA-ness of a citation using some new APIs that have been released by publishers. At that point we will know what percentage of citations on Wikipedia can actually be read by the general public.

To learn more about the OA signalling project you can follow our progress on GitHub. For the code behind this analysis, check out the IPython notebooks.

by max at April 10, 2014 12:14 AM

April 09, 2014

Wikimedia Foundation

Europeana Fashion Handbook to Bring Wiki and GLAMs Together

In an effort to improve fashion knowledge on the web, Europeana Fashion has organized a series of edit-a-thons with Wikimedia volunteers and fashion institutions around Europe. The experience and knowledge gained from these events are now compiled in one handbook, The Europeana Fashion Edit-a-thon Handbook for GLAMs.

Fashion Edit-a-thon logo

What is fashion? Fashion is vanity, fashion is business, fashion is art. Fashion can mean many things to many people, but what is certain, is that it has enormous cultural significance. Every item of clothing has its roots in history and carries a symbolic meaning in the present.

An edit-a-thon around fashion in collaboration with Wikimedia Netherlands and Fashion Muse. May 13, 2013. 

Take, for example, the most basic of garments, the T-shirt. It was originally designed as an undergarment in the American army in the early 20th century. In the 1950s it became part of the uniform of rebellious youth culture and was seen on the likes of Marlon Brando and James Dean. Nowadays, the T-shirt is worn everywhere with everything, even under a suit. From underwear to act of rebellion to formalwear, fashion objects can be considered artifacts of past and present.

That is why there are public and private institutions collecting fashion. Europeana Fashion aims to bring all these collections together in one online portal and improve knowledge around these collections.

The best way to improve knowledge online is through Wikipedia. It’s open, free and one of the most visited websites. In an effort to get communities and institutions involved, Europeana Fashion hosted multiple Wiki edit-a-thons.

Fashion badge Edit-a-thon Europeana. Museum of Decorative Arts (Paris), March 22, 2014. 

After setting up seven edit-a-thons in five countries in one year’s time, the project bundled its experiences in a handbook for organizing fashion edit-a-thons. It is directed towards galleries, libraries, archives and museums, or in short: GLAMs. The handbook is available online and open to improvement from the community.

Engaging Fashionistas

Fashion carries with it very relevant cultural, historical and symbolic meaning. However, despite its social significance, fashion’s presence on Wikipedia is not as comprehensive as it should be. This encouraged Europeana Fashion to partner with Wikimedia volunteers in an effort to increase fashion knowledge and open multimedia in the Wiki world.

Twenty-two partners from twelve European countries work together on the Europeana Fashion portal. Together, these institutions collect and make available thousands of historical dresses, accessories, photographs, posters, drawings, sketches, videos and fashion catalogues. At the same time, it makes these items findable through Europe’s online cultural hub Europeana. Europeana Fashion invited its partners to make available their collections on Wikimedia Commons and welcomed users to write about their collections. The aim: to enrich and share the knowledge about these objects and improve the existing knowledge about fashion’s history, origins and trajectory on Wikipedia.

A Handbook for GLAMs

The Fashion Edit-a-thon Handbook for GLAMs is a best-practice work, written by members of the Europeana Fashion team and composed of knowledge gained after organizing seven fashion edit-a-thons in Sweden, the Netherlands, Italy, Belgium and Israel.

The handbook primarily compiles what we have learned from running these events. It has been reviewed and amended by the Europeana network and Wikimedia community. It provides an overview of Wikimedia, Wikipedia, Wikimedians, the basics of hosting an edit-a-thon, ways to ensure a successful edit-a-thon, how to measure success, tips for getting content on Wikimedia, event promotion, as well as a suggested day programme, a 3 month preparation agenda and an abundance of relevant links. The final result is a reference guide that any GLAM institution hoping to hold an edit-a-thon can utilize.

Edit-a-thon. Monday, May 13th, 2013. 

Promoting “Open”

Europeana Fashion not only wants to improve knowledge around fashion, it also wants to set an example and influence cultural heritage institutions to open up their collections. Improving cultural heritage knowledge on Wikipedia is an incredibly important aim.

We hope to close the gap between institutions and volunteers in their willingness to learn and change. This handbook, while not exhaustive, is a step towards promoting more collaboration between Wikimedia and fashion institutions (or any GLAM, for that matter).

In an effort to keep the collaboration ongoing and beneficial to all, we have included a few questions at the end of the handbook. These are to be answered by any institution that uses it for an edit-a-thon, so that everyone can continue to learn from one another. This handbook hopes to become an intermediary that helps with GLAM-wiki relationships and leads to more open collaborations.

You can download the Fashion Edit-a-thon handbook here.

Gregory Markus, Project Assistant Europeana Fashion at Netherlands Institute for Sound and Vision

by Gregory Markus at April 09, 2014 07:13 PM

Gerard Meijssen

#Quality - the dead as we know them

Both the German and the English Wikipedia have a category for everyone who died in a particular year. For 2014 they currently include 724 and 1,530 people respectively. Obviously, people have died who do not have an article in either Wikipedia.

All the dead registered in Wikidata for 2014 account for 1,724 people. This demonstrates some quality, because Wikidata knows of the deaths of more notable people than either Wikipedia, or even both combined.

There are some Chinese and some Russian people among them. Some people from Spain, Chile and Argentina. From India, the Cook Islands and Sweden.  Many notable people from these and other countries are missing.

When we do add them, they will show up on the date of their death. When you look at today, April 9, 2014, there are currently two people known to have died: one from Serbia, the other from Trinidad and Tobago. This number will grow. Showing it in the Reasonator is already a good thing. A next step would be to create a tool that informs all projects about the people who have died in the last year.

Quality is measured by comparison; it becomes valuable when the information can be acted upon.
Thanks,
     GerardM

by Gerard Meijssen (noreply@blogger.com) at April 09, 2014 04:20 PM

April 08, 2014

Wikimedia Foundation

Odisha Dibasa 2014: 14 books re-released under CC license

This post is available in 2 languages:
English  • Odia

 Guests releasing a kit DVD containing the Odia typeface “Odia OT Jagannatha,” the offline input tool “TypeOdia,” Odia language dictionaries, open-source software, an offline Odia Wikipedia and an Ubuntu package.

Odisha became a separate state in British India on April 1, 1936. Odia, a 2,500-year-old language, recently gained the status of an Indian classical language. The Odia Wikimedia community celebrated these two occasions on March 29 in Bhubaneswar with a gathering of 70 people. Linguists, scholars and journalists discussed the state of the Odia language in the digital era, initiatives for its development and steps that can be taken to increase access to books and other educational resources. Fourteen copyrighted books were re-licensed under a Creative Commons license, and the digitization project on Odia Wikisource was formally initiated by an indigenous educational institute, the Kalinga Institute of Social Sciences (KISS). Professor Udayanath Sahu from Utkal University, The Odisha Review’s editor Dr. Lenin Mohanty, Odisha Bhaskar’s editor Pradosh Pattnaik, Odia language researcher Subrat Prusty, Dr. Madan Mohan Sahu, Allhadmohini Mohanty, chairperson of the Manik-Biswanath Smrutinyasa trust, and the trust’s secretary Brajamohan Patnaik, along with senior members Sarojkanta Choudhury and Shisira Ranjan Dash, spoke at the event.

 Group photo of Odia wikimedians participating in the advanced Wikimedia workshop at KIIT University.

Eleven books from Odia writer Dr. Jagannath Mohanty were re-released under Creative Commons Share-Alike (CC-BY-SA 3.0) license by the “Manik-Biswanath Smrutinyasa” trust,  a trust founded by Dr. Mohanty for the development of the Odia language. Allhadmohini Mohanty formally gave written permission to Odia Wikimedia to release and digitize these books.

The community will be training students and a group of six faculty members at KISS who will coordinate the digitization of these books. “Collaborative efforts and open access to knowledge repositories will enrich our language and culture,” said linguist Padmashree Dr. Debiprasanna Pattanayak as he inaugurated the event. Dr. Pattanayak and Odia language researcher Subrat Prusty from the Institute of Odia Studies and Research also re-licensed, under the same license, three books based on their research on the Odia language and its cultural influence on other societies (two Odia books, “Bhasa O Jatiyata” and “Jati, Jagruti O Pragati”, and an English book, “Classical Odia”). KISS is going to digitize some of these books and make them available on Odia Wikisource.

An OpenType Odia Unicode font, “Odia OT Jagannatha”, designed by Sujata Patel from Odialanguage.com, was released under the OFL license. This is the first Odia OpenType font that the community actively tested. A new Odia offline input tool called “TypeOdia”, by Wikipedian Manoj Sahukar, was also released for public distribution. DVDs containing the font, the input tool, Odia language dictionaries, an offline Odia Wikipedia in Kiwix, a Wikipedia editing guide, an ISCII-to-Unicode font converter, various free and open source software packages and the Ubuntu operating system were distributed to attendees.

Active Odia Wikipedian and admin Mrutyunjaya Kar gave the inaugural speech. Subhashish Panigrahi from the Center for Internet and Society read the annual report and vision of Odia Wikipedia. Chief guest Dr. Debiprasanna Pattanayak discussed the efforts that led to Odia being recognized as the sixth Indian classical language. He noted that a large majority of Odia publications are not available on the internet, leaving readers without easy access, and went on to discuss digitization as a way to preserve valuable out-of-print books and old palm-leaf manuscripts. Professor Udayanath Sahu presented on the process, progress and implementation of the machine translation project at Utkal University.

Advanced Wikimedia workshop at KIIT University.

Experienced Wikimedians conducted an advanced Wikipedia workshop on the second day of the event at KIIT University, Bhubaneswar. It was attended by a majority of the existing Wikimedians from the community, including new Wikipedians who signed up for the Odia Wikipedia Education Program at the Indian Institute of Mass Communication, Dhenkanal. Mrutyunjaya Kar presented on Wikidata and various tools for linking and accessing information in multiple languages across Wikimedia projects. Ansuman Giri discussed advanced technical aspects such as the use of various gadgets, proper categorization, how to use subpages, how to auto-list archive pages, customizing the WikiLove feature, user rights modification, and the importance of citing biographies of living persons with secondary sources. Shitikantha Dash discussed copyright and issues around uploading images and other media files to Wikimedia Commons. Dr. Subas Chandra Rout presented on “notability, referencing and creating citations for notable topics.” Subhashish Panigrahi discussed the work plan for the year, the failure of some program projects, collective learning and the dos and don’ts of community building.

We hope that more authors will come forward and re-release their books under a CC-BY-SA license. The Odia community is excited to see or.wikisource.org go live. A few Wikipedians are even interested in typing up their favorite freely licensed books to make them available on Wikisource. I believe it will be challenging to train the KISS students to type and proofread the texts. In the CISA2K draft plan, the target for the number of editors seems overestimated; the students need some knowledge about Wikimedia and how it works in general before they start. We hope that the books will be digitized properly and that in the coming days more users will join the process as more free books become available on Odia Wikisource. I appeal to the Odia people to be a part of the Odia Wikimedia community and make Odia Wikisource a successful project; we need all the time you can devote. :-)

Ansuman Giri, Odia Wikipedian

 

Subhashish Panigrahi, Odia Wikipedian

Odia

ଓଡ଼ିଶା ଦିବସ ୨୦୧୪: କ୍ରିଏଟିଭ କମନ୍ସ ଲାଇସେନ୍ସରେ ୧୪ ଖଣ୍ଡ ବହିର ବିତରଣ, ଉଇକିପାଠାଗାର ପାଇଁ କାମ ଆରମ୍ଭ

୧୯୩୬ ମସିହା ଅପ୍ରେଲ ୧ ତାରିଖରେ ଇଂରେଜ ଅଧିକୃତ ଭାରତବର୍ଷରେ ଓଡ଼ିଶା, ଭାଷା ଭିତ୍ତିରେ ଏକ ସ୍ଵତନ୍ତ୍ରରାଜ୍ୟ ଭାବେ ଜନ୍ମଲାଭ କଲା ପରେ ପରେ ୨୫୦୦ ବର୍ଷର ସମୃଦ୍ଧ ଇତିହାସ ଓ ପ୍ରମାଣକୁ ଭିତ୍ତି କରି ଏହିବର୍ଷ ଫେବୃଆରୀ ମାସ ୨୦ ତାରିଖ ଅନ୍ତର୍ଜାତିକ ମାତୃଭାଷା ଦିବସର ଅବ୍ୟବହିତ ପୂର୍ବରୁ ଓଡ଼ିଆ ଭାଷା “ଶାସ୍ତ୍ରୀୟ” ମାନ୍ୟତା ଭାବେ ଓଡ଼ିଆ ସ୍ଵୀକୃତ ହେଲା । ଏହାକୁ ପାଳନ କରିବା ପାଇଁ ଓଡ଼ିଆ ଉଇକିମିଡ଼ିଆ ମାର୍ଚ୍ଚ ୨୯ ୨୦୧୪ରେ “ଓଡିଶା ଦିବସ” ସମାରୋହର ଆୟୋଜନ କରାଯାଇଥିଲା । ପ୍ରାୟ ୭୦ ଜଣ ବ୍ୟକ୍ତି ଏଥିରେ ଯୋଗଦେଇ ଓଡ଼ିଆ ଭାଷା ସମ୍ବନ୍ଧିତ ଆଲୋଚନାରେ ଭାଗ ନେଇଥିଲେ । ଅନେକ ଜଣାଶୁଣା ଭାଷାବିଦ, ସାମ୍ବାଦିକ ଓ ଗବେଷକଗଣ ଏଥିରେ ଯୋଗଦାନ କରି ଡିଜିଟାଲ ମାଧ୍ୟମରେ ଓଡ଼ିଆ ଭାଷାର ସ୍ଥିତି, ପ୍ରଗତି ଓ ଆଗକୁ ନିଆଯାଉଥିବା ପଦକ୍ଷେପ ଉପରେ ଆଲୋଚନା କରିବା ସହିତ ଅଧିକରୁ ଅଧିକ ଓଡ଼ିଆ ବହି କିଭଳି ଡିଜିଟାଲ ମାଧ୍ୟମରେ ବିଶ୍ଵର କୋଣ ଅନୁକୋଣରେ ରହୁଥିବା ଓଡ଼ିଆ ମାନଙ୍କ ପାଖରେ ପହଞ୍ଚିପାରିବ ସେ ଉପରେ ମତାମତ ଦେଇଥିଲେ । ଏହି କାର୍ଯ୍ୟକ୍ରମରେ ଓଡ଼ିଆ ଉଇକିପାଠାଗାର ଡିଜିଟାଇଜେସନ ପ୍ରକଳ୍ପର ଶୁଭାରମ୍ଭ କରି କ୍ରିଏଟିଭ କମନ ଲାଇସେନ୍ସ ନିୟମ ଅଧୀନରେ ୧୪ଟି ବହି ବିତରଣ କରଯାଇଥିଲା । ଏହି ପ୍ରକଳ୍ପକୁ ରାଜ୍ୟର ଅଗ୍ରଣୀ ଶିକ୍ଷାନୁଷ୍ଠାନ କଳିଙ୍ଗ ଇନଷ୍ଟିଚ୍ୟୁଟ ଅଫ ସୋସିଆଲ ସାଇନ୍ସେସ (କିସ) ଔପଚାରିକ ଭାବେ ଆରମ୍ଭ କରୁଛି । ଓଡିଆ ଉଇକିମିଡିଆ ଦ୍ଵାରା ଏକ ଡିଭିଡି ମଧ୍ୟ ଉନ୍ମୋଚିତ ହେଇଥିଲା । ଉତ୍କଳ ବିଶ୍ଵବିଦ୍ୟାଳୟର ପ୍ରଫେସର ଡ. ଉଦୟ ନାଥ ସାହୁ, ମାସିକ ଇଂରାଜୀ ପତ୍ରିକା ଓଡିଶାରିଭ୍ୟୁର ସମ୍ପାଦକ ଡ. ଲେନିନ ମହାନ୍ତି, ଓଡ଼ିଶା ଭାସ୍କରର ସମ୍ପାଦକ ପ୍ରଦୋଶ ପଟ୍ଟନାୟକ, ଓଡ଼ିଆ ଭାଷା ଗବେଷକ ସୁବ୍ରତ ପୃଷ୍ଟି, କିସ ଅନୁଷ୍ଠାନରୁ ଡ. ମଦନ ମୋହନ ସାହୁ, ମାଣିକ-ବିଶ୍ଵନାଥ ସ୍ମୃତିନ୍ୟାସର ସଭାପତି ଶ୍ରୀମତୀ ଆହ୍ଲାଦମୋହିନୀ ମହାନ୍ତି, ସଭାପତି ଶ୍ରୀ ବ୍ରଜମୋହନ ପଟ୍ଟନାୟକ ଓ ଅନ୍ୟ ବୟୋଜ୍ୟେଷ୍ଠ ସଦସ୍ୟ ସରୋଜକାନ୍ତ ଚୌଧୁରୀ ଓ ଶିଶିରରଞ୍ଜନ ଦାଶ ନିଜର ମୂଲ୍ୟବାନ ବକ୍ତବ୍ୟ ରଖିଥିଲେ ।

କିଟ ବିଶ୍ୱବିଦ୍ୟାଳୟଠାରେ ଅନୁଷ୍ଠିତ ଉଚ୍ଚତର ଓଡ଼ିଆ ଉଇକିପିଡ଼ିଆ କର୍ମଶାଳାରେ ଭାଗନେଇଥିବା ଓଡ଼ିଆ ଉଇକିଆଳିଗଣ

ଓଡିଆ ଲେଖକ ଡ. ଜଗନ୍ନାଥ ମହାନ୍ତିଙ୍କର ୧୧ଟି ପୁସ୍ତକ କ୍ରିଏଟିଭ କମନ୍ସ ସେୟାର ଏଲାଇକ (CC-BY-SA 3.0)ର ଆଧାରରେ ଲେଖକଙ୍କ ଦ୍ଵାରା ପ୍ରତିଷ୍ଠିତ ଅନୁଷ୍ଠାନ “ମାଣିକ-ବିଶ୍ଵନାଥ ସ୍ମୃତିନ୍ୟାସ” ଦ୍ଵାରା ଓଡ଼ିଆ ଭାଷାର ବିକାଶ ନିମନ୍ତେ ପୁନଃ-ଉନ୍ମୋଚିତ ହେଇଥିଲା । ସ୍ୱାତ୍ୱାଧିକାରିଣୀ ଆହ୍ଲାଦମୋହିନୀ ମହାନ୍ତିଙ୍କ ଲିଖିତ ଅନୁମତିକ୍ରମେ ଓଡ଼ିଆ ଉଇକିମିଡ଼ିଆ ବହିଗୁଡ଼ିକର ଡିଜିଟାଇଜେସନର ଭାର ହାତକୁ ନେଲା । ଓଡ଼ିଆ ଉଇକିମିଡ଼ିଆ କିସ ଅନୁଷ୍ଠାନର ୬ ଜଣ ଶିକ୍ଷକ ଓ କିଛି ଛାତ୍ରଙ୍କୁ ଡିଜିଟାଇଜେସନ ସମ୍ବନ୍ଧରେ ପୂର୍ଣ୍ଣ ସହଯୋଗ କରିବା ସହ ପ୍ରଶିକ୍ଷଣ ମଧ୍ୟ ଦେବ । “ସାମୁହିକ ପ୍ରୟାସ ଓ ଜ୍ଞାନର ଅବାଧ ଉପଯୋଗ ଆମର ଭାଷା ଓ ସଂସ୍କୃତିକୁ ସମୃଦ୍ଧ କରିବାରେ ସାହାଯ୍ୟ କରିବେ ବୋଲି ଭାଷାବିଦ ପଦ୍ମଶ୍ରୀ ଦେବୀପ୍ରସନ୍ନ ପଟ୍ଟନାୟକ ଏହି କାର୍ଯ୍ୟକ୍ରମ ଅବସରରେ ନିଜର ମତପ୍ରକାଶ କରି କହିଥିଲେ । ଓଡ଼ିଆ ଭାଷା ଗବେଷଣା ପ୍ରତିଷ୍ଠାନ ତରଫରୁ ଓଡ଼ିଆ ଭାଷା ଓ ସାଂସ୍କୃତିକ ଭିତ୍ତିଭୂମିର ପ୍ରଭାବ ଉପରେ ଗବେଷଣା ସମ୍ବଳିତ ତଥ୍ୟକୁ ନେଇ ଡ. ପଟ୍ଟନାୟକ ଓ ଓଡ଼ିଆ ଭାଷା ଗବେଷକ ଶ୍ରୀ ସୁବ୍ରତ ପୃଷ୍ଟିଙ୍କ ରଚିତ ୩ଟି ଗବେଷଣା ପୁସ୍ତକ (ଓଡ଼ିଆ ବହି “ଭାଷା ଓ ଜାତୀୟତା” ଓ “ଜାତି, ଜାଗୃତି ଓ ପ୍ରଗତି” ଏବଂ ଇଂରାଜୀ ବହି “କ୍ଲାସିକାଲ ଓଡ଼ିଆ”) କ୍ରିଏଟିଭ କମନ୍ସ ଲାଇସେନ୍ସରେ ବିତରଣ କରିଥିଲେ । “କିସ” ଅନୁଷ୍ଠାନ ଏଥିମଧ୍ୟରୁ କିଛି ବହି ଡିଜିଟାଇଜ କରିବାକୁ ଯାଉଛି ଯାହାକି ପରେ ଓଡ଼ିଆ ଉଇକିସୋର୍ସରେ ମାଗଣାରେ ଉପଲବ୍ଧ ହେବ ।

ଓଡିଆଲାଙ୍ଗୁଏଜ.କମର ସୁଜାତା ପଟେଲଙ୍କ କୃତ ଓପନ ଟାଇପ ଓଡ଼ିଆ ଇଉନିକୋଡ ଫଣ୍ଟ “ଓଡିଆ ଓ.ଟି ଜଗନ୍ନାଥ” ଓ.ଏଫ.ଏଲ ସ୍ଵତ୍ଵରେ ଉନ୍ମୋଚିତ ହେଇଥିଲା । ଏହା ହେଉଛି ପ୍ରଥମ ଓଡ଼ିଆ ଫଣ୍ଟ ଯାହାକୁ ଓଡ଼ିଆ ଉଇକିପିଡିଆ ସଙ୍ଘର ସକ୍ରିୟ ଯୋଗଦାନ ଦ୍ୱାରା ପ୍ରଥମ କରି ଆତ୍ମପ୍ରକାଶ କରିଛି । ଉଇକିଆଳି ମନୋଜ ସାହୁକାରଙ୍କ ନିର୍ମିତ ଏକ ନୂଆ ଓଡ଼ିଆ ଅଫଲାଇନ ଇନପୁଟ ସାଧନ “ଟାଇପଓଡ଼ିଆ (TypeOdia)” ମଧ୍ୟ ଜନସାଧାରଣଙ୍କ ନିମନ୍ତେ ଉତ୍ସର୍ଗ କରାଯାଇଥିଲା । ଏଥିରେ ଅଂଶଗ୍ରହଣ କରିଥିବା ସମସ୍ତଙ୍କୁ ଡିଭିଡି ମାଧ୍ୟମରେ ଉକ୍ତ ଫଣ୍ଟ, ଇନପୁଟ ସାଧନ, ଓଡ଼ିଆ ଅଭିଧାନ, ମୁକ୍ତ ସଫ୍ଟ ର ଗୁଡିକ, କିଉଇକ୍ସରେ ଅଫଲାଇନ ଓଡ଼ିଆ ଉଇକିପିଡ଼ିଆ, ଉଇକିପିଡ଼ିଆ ସମ୍ପାଦନା ସହାୟକ ପୁସ୍ତିକାଉବୁଣ୍ଟୁ ଅପରେଟିଂ ସିଷ୍ଟମ ବିତରଣ କରଯାଇଥିଲା । ସକ୍ରିୟ ଉଇକିଆଳି ମୃତ୍ୟୁଞ୍ଜୟ କର ଉଦଘାଟନୀ ଅଭିଭାଷଣ ଦେଇଥିଲେ ଓ କାର୍ଯ୍ୟକ୍ରମ ପରିଚାଳନା କରିଥିଲେ । ସେଣ୍ଟର ଫର ଇଣ୍ଟରନେଟ ଏଣ୍ଡ ସୋସାଇଟି ତରଫରୁ ସୁଭାସିସ ପାଣିଗାହି ଓଡ଼ିଆ ଉଇକିପିଡ଼ିଆର ବାର୍ଷିକ ବିବରଣୀ ଓ ଭବିଷ୍ୟତର ଯୋଜନା ପାଠକରିଥିଲେ । ମୁଖ୍ୟ ଅତିଥି ଭାବେ ଭାଷାବିଦ ଡ. ଦେବୀପ୍ରସନ୍ନ ପଟ୍ଟନାୟକ ଓଡ଼ିଆ ଭାଷାର ଶାସ୍ତ୍ରୀୟ ମାନ୍ୟତା ଲାଗି ହେଇଥିବା ଆପ୍ରାଣ ପ୍ରଚେଷ୍ଟା ଓ ଉଦ୍ୟମର କାହାଣୀ ବଖାଣିଥିଲେ । ପାଖପାଖି ସମସ୍ତ ଓଡ଼ିଆ ପ୍ରକାଶନ ଇଣ୍ଟରନେଟରେ ଉପଲବ୍ଧ ନଥିବାରୁ ତାହାସବୁ ସମଗ୍ର ବିଶ୍ୱରେ ଥିବା ଓଡ଼ିଆଙ୍କ ପାଖରେ ପହଞ୍ଚିପାରୁନାହିଁ ବୋଲି ସେ କ୍ଷୋଭ ପ୍ରକାଶ କରିଥିଲେ । ଏଥି ସହିତ ସେ ପୁରୁଣା ବହି ଆଉ ତାଳପତ୍ର ପୋଥି ତଥା ପାଣ୍ଡୁଲିପିଗୁଡ଼ିକୁ ଡିଜିଟାଇଜେସନ ମାଧ୍ୟମରେ ସଂରକ୍ଷିତ କରିବା ପାଇଁ ପରାମର୍ଶ ଦେଇଥିଲେ । ପ୍ରଫେସର ଉଦୟନାଥ ସାହୁ ଉତ୍କଳ ବିଶ୍ୱବିଦ୍ୟାଳୟରେ ଚାଲିଥିବା “ମେସିନ ଟ୍ରାନ୍ସଲେସନ” ପ୍ରକଳ୍ପର ପ୍ରକ୍ରିୟା, ପ୍ରଗତି ଓ କାର୍ଯ୍ୟକାରିତା ଉପରେ ବିବରଣୀ ଦେଇଥିଲେ ।

ଅଭିଜ୍ଞ ଉଇକିଆଳିଗଣ ଦ୍ଵିତୀୟ ଦିନ ଭୁବନେଶ୍ଵରସ୍ଥିତ କିଟ ବିଶ୍ୱବିଦ୍ୟାଳୟ ଏକ କର୍ମଶାଳାର ଆୟୋଜନ କରିଥିଲେ ଯେଉଁଥିରେ ଓଡ଼ିଆ ଉଇକିପିଡ଼ିଆ ଶିକ୍ଷା ପ୍ରକଳ୍ପ ଦ୍ଵାରା ଯୋଗଦେଇଥିବା ନୂଆ ଉଇକିଆଳି ଆଇ.ଆଇ.ଏମ.ସି ଢେଙ୍କାନାଳର ଛାତ୍ରଛାତ୍ରୀମାନେ ମଧ୍ୟ ଅଂଶଗ୍ରହଣ କରିଥିଲେ । ଅଭିଜ୍ଞ ଉଇକିଆଳି ମୃତ୍ୟୁଞ୍ଜୟ କର ଉଇକିଡାଟା ଉପରେ ବିବରଣୀ ପ୍ରଦାନ କରିଥିଲେ ଓ ଅନ୍ୟାନ୍ୟ ଭାଷାରେ ଲେଖା ଯାଇଥିବା ଲେଖା ଓ ଉଇକିମିଡ଼ିଆ ପ୍ରକଳ୍ପଗୁଡ଼ିକୁ କିପରି ଓଡ଼ିଆ ଭାଷା ଦ୍ଵାରା ଖୋଜାଯାଇପାରିବ ସେ ଉପରେ ବିସ୍ତ୍ରୁତ ଜ୍ଞାନଦାନ କରିଥିଲେ । ଅନ୍ୟତମ ଅଭିଜ୍ଞ ଉଇକିଆଳି ଓ ପରିଚାଳକ ଅଂଶୁମାନ ଗିରି ଉଇକିପିଡ଼ିଆରେ ବ୍ୟବହୃତ ହେଉଥିବା ନୂଆ ନୂଆ ପ୍ରୟୋଗ ଓ ପ୍ରଦ୍ୟୋଗ, ଗ୍ୟାଜେଟମାନଙ୍କର ବ୍ୟବହାର, ଉପପୃଷ୍ଠାର ବ୍ୟବହାର ପ୍ରଣାଳୀ, ଅଟୋ-ଲିଷ୍ଟ ଅଭିଲେଖ ଓ ଅନ୍ୟ ଉଇକିଆଳିଙ୍କ ସହ ଭାବର ଆଦାନପ୍ରଦାନ ପାଇଁ ବ୍ୟବହୃତ ସାଧନ “ଉଇକିଲଭ”ର ବ୍ୟବହାରିକତା, ବ୍ୟବହାରକାରୀଙ୍କ ଅଧିକାରରେ ବଦଳ ଓ ମାଧ୍ୟମିକ ଆଧାର ବ୍ୟବହାର କରି ଜୀବିତ ବ୍ୟକ୍ତିଙ୍କ ଜୀବନୀ କିପରି ଲେଖାଯିବ ସେ ବିଷୟରେ କହିଥିଲେ । ଉଇକିଆଳି ଶିତିକଣ୍ଠ ଦାଶ କପିରାଇଟ , ଉଇକି କମନ୍ସରେ ଛବି ଓ ଅନ୍ୟାନ୍ୟ ଫାଇଲ ଅପ୍ଲୋଡ କରିବାରେ ଉପୁଜୁଥିବା ସମସ୍ୟା ଉପରେ ଆଲୋକପାତ କରିଥିଲେ । ବରିଷ୍ଠ ଉଇକିଆଳି ଡା. ସୁବାସ ଚନ୍ଦ୍ର ରାଉତ “ଉଇକିପିଡ଼ିଆରେ ପ୍ରସଙ୍ଗଭୁକ୍ତ ହେବା ନିମନ୍ତେ ଯୋଗ୍ୟତା, ଯୋଗ୍ୟ ପ୍ରସଙ୍ଗରେ ଆଧାର ପ୍ରଦାନ” ଉପରେ ଏକ ବିବରଣୀ ଦେଇଥିଲ । ଶେଷରେ ସୁଭାସିସ ପାଣିଗାହି ଚଳିତ ବର୍ଷର କାର୍ଯ୍ୟପନ୍ଥା, ବିଫଳ ହେଇଥିବା କାର୍ଯ୍ୟପ୍ରକଳ୍ପ, ସାମୁହିକ ଶିକ୍ଷା ଓ କମ୍ୟୁନିଟି ଗଠନରେ ନିୟାମକ ଗୁଡ଼ିକ ଉପରେ ଆଲୋଚନା କରିଥିଲେ ।

ଆମେ ଆଶା କରୁଛୁ ଯେ, ଅଧିକରୁ ଅଧିକ ଲେଖକ ନିଜର ବହିମାନ CC-BY-SA ଲାଇସେନ୍ସରେ ବିତରଣ କରିବା ନିମନ୍ତେ ଆଗଭର ହେବେ । ଓଡ଼ିଆ ସମାଜ ଓଡ଼ିଆ ଉଇକିପାଠାଗାର ଆସିବା ପାଇଁ ଖୁସିରେ ଚାହିଁ ବସିଛନ୍ତି । କିଛି ଆଗ୍ରହୀ ଉଇକିଆଳି ନିଜର ପ୍ରିୟ ବହି ଟାଇପ କରି ସେସବୁକୁ ଉଇକିପାଠାଗାରରେ ଉପଲବ୍ଧ କରାଇବା ନିମନ୍ତେ ମଧ୍ୟ ଅନାଇଁ ବସିଛନ୍ତି । ବୋଧହୁଏ କିସଠାରେ ଛାତ୍ରଛାତ୍ରୀଙ୍କୁ ଓଡ଼ିଆରେ ଟାଇପ କରାଇବା ଓ ବନାନ ସୁଧାର ଏକ ବଡ଼ ଆହ୍ୱାନ ହେବ । CISA2Kର ଯୋଜନା ଖସଡ଼ାରେ ସମ୍ପାଦକଙ୍କ ସଂଖ୍ୟା ଅଧିକ ଲାଗୁଛି । ଛାତ୍ରମାନଙ୍କୁ ଉଇକିମିଡ଼ିଆ ବାବଦରେ ଅଧିକ ଶିଖିବାକୁ ପଡ଼ିବ ଓ ଏହା କିପରି ସାଧାରଣଭାବେ କାମ କରେ ତାହା ଜାଣିବାକୁ ପଡ଼ିବ । ଆମେ ଆଶାକରୁ କି ବହିଗୁଡ଼ିକ ଠିକଭାବେ ଡିଜିଟାଇଜ ହୋଇ ଉଇକିପାଠାଗାରକୁ ଅଧିକ ବହିରେ ସମୃଦ୍ଧ କରିବ ତଥା ଆଗାମୀ ଦିନରେ ଅଧିକରୁ ଅଧିକ ସଭ୍ୟ ଯୋଗଦାନ କରିବେ । ଜଗତସାରାର ଓଡ଼ିଆମାନଙ୍କୁ ମୋର ଅନୁରୋଧ, ସେମାନେ ଓଡ଼ିଆ ଉଇକିମିଡ଼ିଆ ସମାଜ ସହ ହାତମିଳାଇ ଓଡ଼ିଆ ଉଇକିପାଠାଗାରକୁ ଏକ ସ୍ଫଳ ପ୍ରକଳ୍ପରେ ପରିଣତ କରିବେ । ଆମେ ଆପଣଙ୍କର ବହୁମୂଲ୍ୟ ସମୟ ଲୋଡ଼ୁଁ! :-)

ଅଂଶୁମାନ ଗିରି, ଓଡ଼ିଆ ଉଇକିଆଳି

by Subhashish Panigrahi at April 08, 2014 10:21 PM

Wikimedia UK

Improving Wikipedia coverage of women artists

The photograph shows three women at a computer screen, having a conversation

Daria Cybulska of Wikimedia UK (centre) speaking with some of the event attendees

This post was written by Althea Greenan of the Women’s Art Library at Goldsmiths College

How did the Wikipedia editathon come about with regards to women artists? There have been a number of editathons that led to the session I held here recently.

I organised a modest follow-up (8th March) to a much bigger event (1 February 2014) organised with Wikimedia NYC. This major event in the US inspired satellite events elsewhere, including one that took place at Middlesex University. The event I organised for the Women’s Art Library to celebrate International Women’s Day was not only a follow-up to this initiative from the librarians in the US, but something I have been wanting to do ever since I became aware of the Wikimedian community and the GLAM projects that connect with collections in galleries, libraries and museums. I have also been in discussions with artists’ groups, and from those conversations emerged the awareness that women artists are not adequately represented on Wikipedia. It demonstrates the bias of content resulting from a lack of women writers, scholars and content creators.

I am the curator of the Women’s Art Library, which was originally set up in the late 1970s as a slide registry, building a centre of documentation and arts activities that raised awareness of women’s art practice. This organisation operated over several decades, and the collection, now at Goldsmiths, continues to act as a centre for research and new art projects, and a space for interventions promoting the work of women, such as the Wikipedia workshop. The charity Wikimedia UK provides trainers and volunteers who demystify the process but also set standards on how to contribute good-quality articles to Wikipedia, so it seemed like a very obvious thing to set up and see if it flies.

It was a very successful, exciting debate regarding the feminist strategy, born of necessity, that we need to write our own histories, set in the context of a rapidly expanding global resource that is seeking to be inclusive and yet maintain high, impartial standards of knowledge sharing. It is absolutely necessary to take up the challenge this opportunity brings, and the important result from my first workshop is that everyone would like to follow it up with more sessions, to build on the knowledge and confidence needed to create records.

Pages are set up in Wikipedia relating to these events that might list records created etc. (like this one), but it takes time to generate these, and to track down images that can be licensed under Creative Commons. The fact that an image can only be used if you relinquish aspects of copyright, allowing unrestricted use, can feel like an obstacle to some artists, but museums and others are increasingly putting images online, and allowing photography in public displays that acknowledge a different cultural approach to image-sharing.

In the past the Women’s Art Library has ‘tackled gender equality in the arts’ through publications, especially a magazine that was distributed globally by the time the funding came to an end in 2002. It is a strategy that creates a context for contemporary and emerging artists to see themselves alongside each other and historical women artists, and the powerful resonance that perspective gives is something that remains not only in reading back over those articles (an anthology is forthcoming in 2015 from IB Tauris), but also in the articles that now appear in a different publishing setting: the Internet.

The workshop was attended by a multi-generational cross-section of artists, students, lecturers, a trainee archivist and a musician, and all felt welcomed into the conversation. I think that’s not only because I invited them to the Women’s Art Library to take a place at the table, but because, yes, I think Wikipedia is a good place to start redressing the balance. There is a very rich world of women’s art practice that we are aware of but which should become part of our shared knowledge too.

by Stevie Benton at April 08, 2014 02:24 PM

Gerard Meijssen

#Wikidata - Nordic Children's Book Prize

The Nordic Children's Book Prize has been awarded to people from Iceland, the Faroe Islands, Denmark, Sweden and Norway.

The article for the prize exists in many languages including English. They all list the winners of the prize, and the list is incomplete in all of them, though not necessarily in the same way.

Wikidata knows about 23 winners, including the 2013 winner. She was already known in Wikidata, but no Wikipedia identified her as the winner of this prize.

It would be nice if all the winners were known to Wikidata, and if there were an image for each of them.
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at April 08, 2014 01:13 PM

#Quality - #Wikipedia vs #Wikidata

Probably the most divisive issue in both Wikipedia and Wikidata is quality. It is because of expectations and insistence on what "everybody" should do.

Both projects are wikis and, as far as I am concerned, the argument was decided when Nupedia went to its early grave. The clincher came when research proved that Wikipedia is as good as its competition.

Quality in a Wiki world is comparative and not an absolute. You can compare Wikidata to each Wikipedia and, you can compare Wikidata to all Wikipedias. Wikidata knows about more "items of knowledge" than any Wikipedia. Every Wikipedia includes articles that are not yet represented in Wikidata and when they are, many statements are waiting to be made in Wikidata.

To create an environment where a Wikipedia can use Wikidata for its information, there are a few prerequisites and considerations:
  • at least every article needs to be connected to a Wikidata item
  • all the data needs to be available, preferably in Wikidata only
  • the information should be presentable in the language of the Wikipedia
This all will work when there is one shared understanding, one shared ambition: to share in the sum of all knowledge. It restricts what a community can decide, it directs what best practices are and it defines where tools are needed to support best practices.

Wikidata is a game changer and we as a community are slow to understand its implications. From a development point of view there is an obvious geographical divide. MediaWiki development at the application level does not consider Wikidata; its architecture ignores it. Wikidata development is first and foremost development of its infrastructure; there is no other option, but as a consequence the tooling is mostly fragmented. Many in our communities consider Wikidata a service project and, while it is, it is becoming so much more. Catering to ill-formed arguments and sentiments of the diverse communities will not serve Wikidata at all, and it will not serve those communities either. What works is for everybody to work on the Wikidata data that is important to him or her, and to appreciate the considerations that make Wikidata the information platform for us all.
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at April 08, 2014 05:12 AM

Pete Forsyth, Wiki Strategies

Where is the real report on the Belfer Center’s Wikipedia program?

For those seeking more context for this blog post, I recommend this summary by William Beutler.

The "Lessons Learned" identified in the Wikimedia Foundation's report

The “Lessons Learned” identified in the Wikimedia Foundation’s report

Last week, Sue Gardner of the Wikimedia Foundation (WMF) reported on WMF’s role in placing a paid Wikipedia writer (“Wikipedian in Residence” or “WIR”) at Harvard’s Belfer Center, paid for by the Stanton Foundation. Gardner’s report acknowledges that WMF was negligent; but based on my substantial and direct contact with the program – before, during, and after its execution, as described here – I consider that conclusion insufficient. The WMF knew that its actions were mistakes before it made them – and then it made them anyway.

This program should have been approached in a way that supported the Wikimedia vision, and the ethical principles articulated by Wikipedia’s editorial community (e.g. here and here) from the beginning. Wikimedia personnel knew the importance of this principle, but did not act on it. The problems of the program went far beyond mere negligence. The WMF willfully disregarded best practices for engaging with Wikipedia – which is supposed to be one of its core competencies. I think we deserve a thorough explanation of how and why that happened, and so far, we don’t have one.

Here, I’d like to respond to the first two of the three “lessons learned” identified by Gardner in her report. (I have no direct knowledge relevant to the third lesson.)

Gardner’s Lesson #1:

At the point when it became clear that this project was not a simple pass-through grant but required programmatic work, the Executive Director should have transferred responsibility for it to a programmatic area…

Gardner identifies a point that necessitated a different course of action, but she doesn’t state exactly when that point was.

Let me fill in that gap: that point was at the very beginning – that is, before Gardner, as Executive Director, assigned responsibility to the Revenue department.

That much should be clear to those readers who have a general familiarity with WMF and its approach to fund-raising and programs. No pass-through grant, to fund a kind of program (WIR) that WMF has consistently refused to directly fund since at least 2010, should ever have been viewed as “simple,” or appropriate for execution by fund-raising staff.

The notion that this project had programmatic significance was made especially clear, though, in fall 2011, when Gardner first introduced me to Lisa Gruwell, then WMF’s newly hired Deputy Chief Revenue Officer. Gardner outlined my professional background designing similar programs, and said to Gruwell:

You should talk to Pete about the Wikipedian in Residence position the Stanton Foundation wants us to put together.

In short, the programmatic significance of the project was understood by Gardner and Gruwell many months before the job description was posted; it was not a revelation that came through part-way through the project.

Gardner’s Lesson #2:

…when the Stanton Foundation made it clear that it expected article editing to be a core job responsibility, the WMF acceded to that request, replacing the job description with a new version provided by the Stanton Foundation and the Belfer Center. The WMF didn’t give that new version enough scrutiny before agreeing to it, and didn’t inform the people who’d been advising us. This was a mistake.

What Gardner says in Lesson 2 is factually correct, but a reader might get the impression that the mistake was a mere oversight – that a couple of stray comments from volunteers, perhaps, got lost in the shuffle.

Such an impression would be wildly inaccurate. The guidance that I, Liam Wyatt, Lori Byrd-Phillips, and Wikimedia staff provided was thorough, and it was clearly communicated through proper channels.

Here are a few notes about the discussion I initiated, in which Gruwell and several of her WMF colleagues actively engaged, as we explored the substantial problems with the program and the job description that was being posted:

  • Dates of discussion: April 16 to April 27, 2012
  • 25 email messages
  • Active commenting and editing of the job description outside of email
  • In the early stages of that discussion (April 16), Gruwell stated:

I completely understand your concerns about paid editing and the [sic] is not an attempt at it.

Now, we learn from Gardner’s report that Stanton and Belfer later insisted on explicitly restoring the paid editing component of the program. Gruwell did not run that change by Wyatt, myself, or (apparently) anybody with relevant qualifications.

Because of the extensive discussion devoted to precisely this point, I do not believe this could have been a simple oversight. It had to be a considered decision to keep that information from us. I do not have any further insights into why that decision was made; but referring to that decision as a “mistake” in the report fails to capture the nature of the decision in a credible way.

Conclusions

I believe it is too early to draw conclusions about the Belfer Center’s Wikipedian in Residence program, or the Wikimedia Foundation’s role in it. WMF personnel did the wrong things, and the organization has conceded that much. Good. But they also did so with a level of disregard for the consequences that, to me, appears reckless. Reckless disregard for the future of Wikipedia, this thing we have all worked so hard to build, this thing we all believe in so passionately.

And we haven’t yet heard why.

Some have speculated that there was a deliberate attempt to subvert Wikipedia’s ethical principles as a special favor to an important funder.

I am highly confident that is not the case, based on my personal experience with the various people involved. Not just based on my personal experiences, though – I also believe the relative scale of the program belies the “corruption” story. If a foundation with assets in the hundreds of millions were to do something intentionally subversive, it would do so at a scale far greater than $50,000; and it would do more than add a few hundred words to each of a dozen Wikipedia articles. But in the absence of a thorough and credible explanation from the organizations involved, I think it may be inevitable for those who care about Wikipedia and its future to engage in this kind of damaging speculation.

Even though major questions remain about the Belfer program itself, we can draw an important conclusion about the Wikimedia Foundation’s reporting on the program to date:

Thus far, the WMF has not given Wikipedia’s stakeholders (including readers, organizations contemplating a WIR, and Wikipedia volunteers) a clear and thorough account of what happened, and why the significant decisions were made.

What were the goals that drove the WMF to dedicate staff time to this project, and attach its valuable trademarks to the job posting? How did the goals relate to the movement’s five year strategic plan? Were the goals accomplished? What unanticipated results occurred? Overall, what were the effects on the quality of Wikipedia? on the strength of the Wikipedia brand among the people and organizations who share our vision, which supports Wikipedia’s increasing reach? on the health of the Wikipedia community – will this increase participation? And in this case – what is the impact on the effectiveness of the Wikimedia Foundation, an organization whose central purpose is to support the work of a huge and deeply passionate community?

These are the kinds of questions the Wikimedia Foundation rightly expects its grantees, program staff, and chapter organizations to consider. But they have not been addressed in the Wikimedia Foundation’s own report of the Belfer Center Wikipedian in Residence program.

by Pete Forsyth at April 08, 2014 12:01 AM

April 07, 2014

Tech News

Tech News issue #15, 2014 (April 7, 2014)


April 07, 2014 12:00 AM

April 05, 2014

Gerard Meijssen

#Wikipedia - Esmonde and Larbey

Esmonde and Larbey were two British writers: a television comedy script-writing duo, active from the 1960s to the 1990s, who created popular situation comedies such as Please Sir! and The Good Life.

Mr Esmonde died in 2008 and Mr Larbey died recently, on March 31. Treating such a pair as a single subject may make sense in an article, but for Wikidata they have to be two distinct items. How else can you identify Mr Larbey as one of the recently departed?

Given that Wikipedia covers them in a single article, Wikidata identifies both Mr Esmonde and Mr Larbey as part of this duo.
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at April 05, 2014 08:52 AM

April 04, 2014

Jeroen De Dauw

Wikidata Code Review 2014

One year ago we had the Qafoo guys come into the Wikimedia Deutschland office to review the software we had written for the Wikidata project. There is a summary of the review as well as a big PDF with all the details.

This week I presented a follow-up review to the Wikidata team. The primary goal of this review was to make the changes since the last review visible, and to suggest how to improve things further. Check out the slides of the review.

The first part of the presentation looks at some simple code metrics for Wikibase.git and how they changed over the last year. After that the main part of the presentation starts. This part looks at individual points from the 2013 review, the progress we made on them, and what can still be done. The end of the presentation looks at how we tackled the action items from the 2013 review, or rather how we did not, and also lists a number of important improvements we made that were not mentioned in that review.
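The metrics themselves are nothing exotic. For anyone who wants to reproduce something comparable on their own checkout, here is a minimal sketch in Python that tallies PHP files and non-blank lines in a working copy; it is not the actual tooling behind the slides, and the path is a placeholder:

    import os

    def count_php_loc(root):
        """Walk a checkout and tally PHP files and their non-blank lines."""
        files = 0
        lines = 0
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                if not name.endswith(".php"):
                    continue
                files += 1
                with open(os.path.join(dirpath, name), errors="ignore") as handle:
                    lines += sum(1 for line in handle if line.strip())
        return files, lines

    if __name__ == "__main__":
        # Placeholder path: point this at your own Wikibase.git checkout.
        php_files, php_lines = count_php_loc("Wikibase")
        print(php_files, "PHP files,", php_lines, "non-blank lines")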

by Jeroen at April 04, 2014 04:08 PM

Wikimedia Tech Blog

A young developer’s story of discovery, perseverance and gratitude

This post is a discovery report written by Jared Flores and slightly edited for publication. It’s part of a series of candid essays written by Google Code-in students, outlining their first steps as members of the Wikimedia technical community. You can write your own.


When I initially heard of the Google Code-In (GCI) challenge, I wasn’t exactly jumping out of my seat. I was a little apprehensive, since the GCI sample tasks used languages such as Java, C++, and Ruby. While I’ve had my share of experience with the languages, I felt my abilities were too limited to compete. Yet, I’ve always had a fiery passion for computer science, and this challenge presented another mountain to conquer. Thus, after having filtered through the hundreds of tasks, I took the first step as a Google Code-In student.

The first task I took on was to design a share button for the Kiwix Android app, an offline Wikipedia reader. Though Kiwix itself wasn’t a sponsoring organization for GCI, it still provided a branch of tasks under the Wikimedia umbrella. With five days on the clock, I researched vigorously and studied the documentation for Android’s share API.

After a few hours of coding, the task seemed to be complete. Reading through the build documentation, I downloaded all of the listed prerequisites, then launched the Kiwix autogen bash file. But even with all of the required libraries installed, Kiwix still refused to compile. Analyzing the error logs, I encountered permission errors, illegal characters, missing files, and mismatched dependencies. My frustration growing, I even booted Linux from an old installation DVD and tried compiling there. I continued this crazy cycle of debugging until 2 am. I would have continued longer had my parents not demanded that I sleep. The next morning, I whipped up a quick breakfast, and then rushed directly to my PC. With my mind refreshed, I tried a variety of new approaches, finally reaching the point where Kiwix compiled.

With a newly-found confidence, I decided to continue pursuing more GCI tasks. Since I had thoroughly enjoyed the challenge presented by Kiwix, I initially wanted to hunt down more of their tasks. However, finding that there weren’t many left, I gained interest in Kiwix’s supporting organization: Wikimedia. I navigated to Wikimedia’s GCI information page and began familiarizing myself with the organization’s mission.

“We believe that knowledge should be free for every human being. We prioritize efforts that empower disadvantaged and underrepresented communities, and that help overcome barriers to participation. We believe in mass collaboration, diversity and consensus building to achieve our goals. Wikipedia has become the fifth most-visited site in the world, used by more than 400 million people every month in more than 270 languages.” – About Us: Wikimedia (GCI 2013)

Reading through the last sentence once more, I realized the amazing opportunities that were ahead of me. Whenever I needed to touch up on any given topic, Wikipedia was always one of the top results. Moreover, Wikipedia had become a source of entertainment for me and my friends. We always enjoyed hitting up a random article, then using the given links to find our way to Pokémon, Jesus, or maybe even Abraham Lincoln: Vampire Hunter.

Eager to begin, I chose video editing as my second task for Wikimedia. I began the long endeavor of watching, reviewing, and editing the two forty-five-minute clips. Despite the length of the videos, I was quite amused to see the technical difficulties that the Wikimedia team encountered during their Google Hangout. It was also comforting to put human faces to the Wikimedia mentors of Google Code-In.

As with my first task, the work itself sped by quickly. But also similar to Kiwix, I encountered some difficulties with the “trivial” part of the task. I had never worked with the wiki interface before, so the wiki structure was somewhat foreign. I only had a vague idea of how to create a page. I also didn’t know where to upload files, nor did I know how to create subcategories. Nonetheless, after following the instructions in Wikipedia’s documentation, I finally managed to upload the videos. Marking the task as complete, I scouted for my third GCI task.
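For readers who prefer scripts over the wiki interface, uploads can also go through the MediaWiki web API. The Python sketch below is only an illustration: it assumes an already-authenticated requests session and a CSRF token for an account with upload rights, and the wiki URL and file names are placeholders.

    import requests

    API_URL = "https://commons.wikimedia.org/w/api.php"  # placeholder target wiki

    def upload_file(session, csrf_token, local_path, wiki_filename, comment):
        """Upload one file via the MediaWiki action API (action=upload)."""
        with open(local_path, "rb") as handle:
            response = session.post(API_URL, data={
                "action": "upload",
                "filename": wiki_filename,
                "comment": comment,
                "token": csrf_token,
                "format": "json",
            }, files={"file": handle})
        response.raise_for_status()
        return response.json()

    # Hypothetical usage; `session` must already be logged in and `token`
    # fetched from the API (on current MediaWiki, action=query&meta=tokens).
    # result = upload_file(session, token, "hangout-part1.webm",
    #                      "GCI_Hangout_Part_1.webm", "Google Code-in task")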

Unbeknownst to me, my third task for Wikimedia would also prove to be the most challenging so far. Since this task required me to modify the code, I requested developer access. With the help of Wikimedia’s instructions, I registered myself as a developer, generated a private key to use with their servers, and proceeded to download the source code.

Though my experience with Git was quite basic, MediaWiki provided easy-to-follow documentation, which aided greatly in my efforts to download their repository. As I waited for the download to complete, I quickly set up an Apache server as a testing environment. After configuring the MediaWiki files for my server, I began the installation. Fortunately, MediaWiki’s interface was quite intuitive; the installer performed flawlessly with minimal user input.

“Off to a good start,” I chuckled quietly to myself, a grin spreading across my face. With that statement I tempted fate, and my troubles began. Upon opening the code, I realized I couldn’t easily comprehend a single line. I had worked with PHP before, but this code was more advanced than anything I had written.

Running my fingers through my hair, I sighed in exasperation. I spent the next few hours analyzing the code, trying my best to decipher the functions. Suddenly, patterns began appearing and I started to recognize numerous functions. I tinkered with different modules until the code slowly unraveled.

Having finally formulated a solution, I let my fingers move swiftly across the keyboard, implementing the code with ease. Confident that I had tested my code well, I followed the instructions in the GCI task description and uploaded my very first patch to Gerrit.

I was surprised at how simple the upload was. But what especially surprised me was the immediate feedback from the mentors. Within a few minutes of the upload, MediaWiki developers were already reviewing the patch, making suggestions for improvement.

Thankful for their helpful input, I worked to implement the changes they suggested. Adding the finishing touches, I was ready to upload another patch. However, I was unsure whether I should upload a new change to Gerrit or push to the same patch as before. Unclear about which step to take, I made the rookie error of uploading a new Gerrit commit.

My mistake quickly received a corrective response from Aude via the Gerrit comment system. While I initially felt embarrassed, I was also relieved that I didn’t have to work alone. In fact, I was thankful that the MediaWiki collaborators taught me how to do it right.

Checking out the link Aude had given me, I learned to squash the two commits together. However, when I tried to follow Aude’s instructions, I somehow managed to mix someone else’s code with my own. Even worse, I had already pushed the changes to Gerrit, exposing my blunder publicly.

Had it been any normal day, I would’ve just been calm and tried my best to fix it. But it just so happened to be the Thanksgiving holiday (in the United States). I had to leave in a few minutes for a family dinner and I couldn’t bear the thought of leaving my patch in a broken state.

I felt about ready to scream. I abandoned my Gerrit patch, and navigated to the task page, ready to give up. But just as I was about to revoke my claim on the task, I remembered something Quim Gil had told another GCI student:

“They are not mistakes! Only versions that can be improved. Students learn in GCI, and all of us learn every day.”

Remembering this advice, I cleared my mind, ready to do whatever it would take, and learn while I was at it. And like an answer to my prayers, Hoo Man, another developer, posted a comment in Gerrit. He guided me through how I could return to my original patch and send my new improvements through. And more importantly, he motivated me to persevere.
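For anyone who hits the same wall: the usual cure, as I now understand it (and not necessarily the exact commands I was given), is to amend the original commit instead of creating a new one. As long as the Change-Id footer in the commit message stays the same, Gerrit treats the push as a new patch set on the existing change. A rough Python wrapper around those git commands might look like this; the remote and branch names are assumptions:

    import subprocess

    def push_new_patch_set(remote="origin", branch="master"):
        """Amend the current commit and push it to Gerrit as a new patch set.

        Keeping the Change-Id footer intact is what tells Gerrit this is an
        update to an existing change rather than a brand-new one.
        """
        subprocess.run(["git", "commit", "--all", "--amend", "--no-edit"], check=True)
        subprocess.run(["git", "push", remote, "HEAD:refs/for/" + branch], check=True)

    if __name__ == "__main__":
        push_new_patch_set()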

I came into GCI as a passionate, yet undisciplined student. I’m thrilled that in joining this competition, the Wikimedia open source community has already helped me plant the seeds of discipline, perseverance, and collaboration. It’s no coincidence that my hardest task thus far was staged on Thanksgiving. Every year I express gratitude towards friends and family. But this year, Google Code-In and the Wikimedia community have made my gratitude list as well.

Jared Flores
2013 Google Code-in student


Read in this series:

by Guillaume Paumier at April 04, 2014 03:55 PM

Gerard Meijssen

#Wikidata - Bartolomeo Pepe

Mr Pepe is the first Italian with the OpenPolis property on his Wikidata item. #Reasonator shows a link to an external website where information about Italian politicians can be found. Mr Pepe is a member of the Senate of the Republic.

The obvious thing to do is to link all existing articles of Italian politicians to OpenPolis. After that there are many "opportunities":

  • add an item in Wikidata for missing Italian politicians
  • add statements that complete the information about them
  • seek images to illustrate their articles and items
It will be fun to see what they come up with once all the missing information has been added to Wikidata. Adding the OpenPolis property acknowledges that an item, and its article, is about an Italian politician; by itself it does not provide that information in a way that can be understood in languages other than Italian.
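Adding such a statement by hand works fine, but it can also be scripted. Below is a minimal pywikibot sketch; the item number, the property number for OpenPolis, and the identifier value are placeholders that have to be replaced with the real ones before running anything:

    import pywikibot

    site = pywikibot.Site("wikidata", "wikidata")
    repo = site.data_repository()

    def add_openpolis_id(item_id, property_id, openpolis_id):
        """Attach an OpenPolis identifier statement to one Wikidata item."""
        item = pywikibot.ItemPage(repo, item_id)
        claim = pywikibot.Claim(repo, property_id)
        claim.setTarget(openpolis_id)
        item.addClaim(claim, summary="Adding OpenPolis identifier")

    # Placeholders: substitute the politician's real item number and the
    # real property number for the OpenPolis identifier.
    add_openpolis_id("Q123456", "P0000", "openpolis-id-goes-here")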
Thanks,
      GerardM

by Gerard Meijssen (noreply@blogger.com) at April 04, 2014 02:00 PM

Wikimedia Tech Blog

Migrating Wikimedia Labs to a new Data Center

As part of ongoing efforts to reduce our reliance on our Tampa, Florida data center, we have just moved Wikimedia Labs to EQIAD, the new data center in Ashburn, Virginia. This migration was a multi-month project and involved hard work on the part of dozens of technical volunteers. In addition to reducing our reliance on the Tampa data center, this move should provide quite a few benefits to the users and admins of Wikimedia Labs and Tool Labs.

Migration objectives

We had several objectives for the move:

  1. Upgrade our virtualization infrastructure to use OpenStack Havana;
  2. Minimize project downtime during the move;
  3. Stop relying on nova-network and start using Neutron;
  4. Convert the Labs data storage system from GlusterFS to NFS;
  5. Identify abandoned and disused Labs resources.

Upgrade and Minimize Downtime

Wikimedia Labs uses OpenStack to manage the virtualization back-end. The Tampa Labs install was running a slightly old version of OpenStack, ‘Folsom’. Folsom is more than a year old now, but OpenStack does not provide an in-place upgrade path that doesn’t require considerable downtime, so we’ve been living with Folsom to avoid disrupting existing Labs services.

Similarly, a raw migration of Labs from one set of servers to another would have required extensive downtime, as simply copying all of the data would be the work of days.

The solution to both 1) and 2) was provided by OpenStack’s multi-region support. We built an up-to-date OpenStack install (version ‘havana’) in the Ashburn center and then modified our Labs web interface to access both centers at once. In order to ease the move, Ryan Lane wrote an OpenStack tool that allowed users to simultaneously authenticate in both data centers, and updated the Labs web interface so that both data centers were visible at the same time.
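The details of Ryan’s tool are his own, but the underlying idea is straightforward: present the same credentials to the Keystone identity service in each region, and each region hands back its own token and service catalogue. A minimal Python illustration against the Keystone v2.0 API (the identity API version current in that era) follows; the endpoint URLs, region labels and tenant name are placeholders, not our real configuration:

    import requests

    # Placeholder Keystone endpoints, one per region.
    KEYSTONE_ENDPOINTS = {
        "tampa": "https://keystone.tampa.example.org:5000",
        "ashburn": "https://keystone.ashburn.example.org:5000",
    }

    def get_tokens(username, password, tenant):
        """Authenticate against every region and return one token per region."""
        tokens = {}
        for region, endpoint in KEYSTONE_ENDPOINTS.items():
            response = requests.post(endpoint + "/v2.0/tokens", json={
                "auth": {
                    "tenantName": tenant,
                    "passwordCredentials": {"username": username,
                                            "password": password},
                },
            })
            response.raise_for_status()
            tokens[region] = response.json()["access"]["token"]["id"]
        return tokens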

At this point (roughly a month ago), we had two different clouds running: one full and one empty. Because of a shared LDAP back-end, the new cloud already knew about all of our projects and users.

Two clouds, before migration

Then we called on volunteers and project admins for help. In some cases, volunteers built fresh new Labs instances in Ashburn. In other cases, instances were shut down in Tampa and duplicated using a simple copy script run by the Wikimedia Operations team. In either case, project functions were supported in both data centers at once so that services could be switched over quickly and at the convenience of project admins.

Two clouds, during migration

As of today, over 50 projects have been copied to or rebuilt in Ashburn. For those projects with uptime requirements, the outages were generally limited to a few minutes.

Switch to OpenStack Neutron

We currently rely on the ‘nova-network’ service to manage network access between Labs instances. Nova-network is working fine, but OpenStack has introduced a new network service, Neutron, which is intended to replace nova-network. We hoped to adopt Neutron in the Ashburn cloud (largely in order to avoid being stuck on unsupported software), but quickly ran into difficulties. Our use case (flat DHCP with floating IP addresses) is not currently supported in Neutron, and OpenStack designers seem to be wavering in their decision to deprecate nova-network.

After several days of experimentation, expedience won out and we opted to reproduce the same network setup in Ashburn that we were using in Tampa. We may or may not attempt an in-place switch to Neutron in the future, depending on whether or not nova-network continues to receive upstream support.

Switch to NFS storage

Most Labs projects have a shared project-wide volume for storing files and transferring data between instances. In the original Labs setup, these shared volumes used GlusterFS. GlusterFS is easy to administer and designed for use cases similar to ours, but we’ve been plagued with reliability issues: in recent months, the lion’s share of Labs failures and downtime were the result of Gluster problems.

When setting up Tool Labs last year and facing our many issues with GlusterFS, Marc-Andre Pelletier opted to set up a new NFS system to manage shared volumes for the Tool Labs project. This work has paid off with much-improved stability, so we’ve adopted a similar system for all projects in Ashburn.

Again, we largely relied on volunteers and project admins to transfer files between the two systems. Most users were able to copy their data over as needed, scping or rsyncing between Tampa and Ashburn instances. As a hedge against accidental data loss, the old Gluster volumes were also copied into backup directories in Ashburn using a simple script. The total volume of data copied was around 30 terabytes; given the many-week migration period, network bandwidth between locations turned out not to be a problem.
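The backup script itself is not worth reproducing here, but the idea amounts to little more than a loop around rsync. A rough Python sketch of the same shape, with made-up host, path, and project names, would be:

    import subprocess

    # Made-up names: the real source was the Gluster volume for each project
    # in Tampa, and the destination a backup directory on the Ashburn NFS server.
    PROJECTS = ["exampleproject1", "exampleproject2"]
    SOURCE_HOST = "gluster.tampa.example.org"
    DEST_ROOT = "/srv/backup/gluster"

    def copy_project(project):
        """Mirror one project's shared volume into its backup directory."""
        source = "%s:/volumes/%s/" % (SOURCE_HOST, project)
        destination = "%s/%s/" % (DEST_ROOT, project)
        subprocess.run(["rsync", "-a", source, destination], check=True)

    for name in PROJECTS:
        copy_project(name)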

Identify and reclaim wasted space

Many Labs projects and instances are set up for temporary experiments, and have a short useful life. The majority of them are cleaned up and deleted after use, but Labs still has a tendency to leak resources as the odd instance is left running without purpose.

We’ve never had a very good system for tracking which projects are or aren’t in current use, so the migration was a good opportunity to clean house. For every project that was actively migrated by staff or volunteers, another project or two simply sat in Tampa, unmentioned and untouched. Some of these projects may yet be useful (or might have users but no administrators), so we need to be very careful about prematurely deleting them.

Projects that were not actively migrated (or noticed, or mentioned) during the migration period have been ‘mothballed’. That means that their storage and VMs were copied to Ashburn, but are left in a shutdown state. These instances will be preserved for several months, pending requests for their revival. Once it’s clear that they’re fully abandoned (in perhaps six months), they will be deleted and the space reused for future projects.

Conclusions

In large part, this migration involved a return to older, more tested technology. I’m still hopeful that in the future Labs will be able to make use of more fundamentally cloud-designed technologies like distributed file shares, Neutron, and (in a perfect world) live instance migration. In the meantime, though, the simple approach of setting up parallel clouds and copying things across has gone quite well.

This migration relied quite heavily on volunteer assistance, and I’ve been quite charmed by how gracious the vast majority of volunteers were about this inconvenience. In many cases, project admins regarded the migration as a positive opportunity to build newer, cleaner projects in Ashburn, and many have expressed high hopes for stability in the new data center. With a bit of luck we’ll prove this optimism justified.

Andrew Bogott, DevOps Engineer

by Andrew Bogott at April 04, 2014 01:06 PM