
Announcing a new series, “Curator’s Choice”

Adam Green - July 3, 2013 in Featured, Open GLAM, Public Domain, Public Domain Review

This week sees the launch of the “Curator’s Choice” series – a joint endeavour of The Public Domain Review and OpenGLAM – which aims to actively engage with and celebrate those cultural heritage institutions that have taken the exciting step of opening up their content.

This new series shall consist of a monthly guest post from a gallery, library, archive or museum curator reflecting upon a group of works in one of their open digital collections – that is, public domain material which has had no restrictions placed on it as it has been digitised and made available online. The series aims to be a celebratory spotlight both on the institutions taking the exciting step of openly licensing their digital collections and on the curators who work every day with such collections – as well, of course, as being a celebration of the content itself.

The series shall be housed on both The Public Domain Review and the OpenGLAM website.

The inaugural post is from the British Library’s Phil Hatfield and Andrew Gray who take a look at the fascinating array of photographs in the British Library’s Canadian Colonial Copyright Collection – see it here on The Public Domain Review and here on OpenGLAM.

Here’s a sneak preview of just a few of the gems from the collection.

‘The Wrestlers’, deposited in 1905 by R. H. Trueman [copyright number 15767]. We all hope the bear was trained… – Source

Part of a series of stereoscopic photographs telling the story of Mr. Turtledove’s fancy for the French cook. Deposited in 1906 by Arthur Lawrence Merrill [copyright number 17212] – Source

Part of a photographic series on performing animals, deposited by John A. Brown in 1920 – Source

Next month’s piece shall be from the Rijksmuseum. You can follow new additions to the series through this RSS feed.

Announcing the Open Humanities Award Winners

Sam Leon - May 8, 2013 in Featured, Open GLAM, Open Humanities


Earlier this year, as part of the DM2E project, we put out a call to humanities academics and technologists to see if they could come up with innovative ideas for small technology projects that would further humanities research by using open content, open data and/or open source.

We’re very pleased to announce that the winners are Dr Bernhard Haslhofer (University of Vienna) and Dr Robyn Adams (Centre for Editing Lives and Letters, University College London). Both winners will receive financial support to help them undertake the work they proposed and will be blogging about the progress of their project. You can follow their progress via the DM2E blog.

Award 1: Semantic tagging for old maps… and other things


The first Award goes to Dr Bernhard Haslhofer of the University of Vienna. His project will involve building on an open source web application he has been working on called Maphub.

Dr Haslhofer told us a little bit about the inspiration for his project:

“People love old maps” is a statement that we heard a lot from curators in libraries. This, combined with the assumption that many people also have knowledge to share or stories to tell about historical maps, was our motivation to build Maphub.

In essence Maphub is an open source Web application that, first of all, pulls out digitized historical maps from closed environments, adds zooming functionality, and assigns Web URIs so that people can talk about them online. It also supports two main use cases:

(i) georeferencing maps by linking points on the map to Geonames locations; (ii) commenting on maps or map regions by creating annotations. While users are entering their comments, Maphub analyzes the entered text on the fly and suggests so-called semantic tags, which the user accepts or rejects.

Semantic tags appear like “normal” tags on the user interface, but are in fact links to DBpedia resources. In that way, the user links her annotations and therefore also the underlying historical map with resources from two open data sources. Besides consuming open data during the annotation authoring process, Maphub also contributes collected knowledge back as open data by exposing all annotations following the W3C Open Annotation specification. In that way, Maphub supports people in a loop of using and producing open data in the context of historical maps.
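For readers curious what such an annotation looks like on the wire, here is a minimal, illustrative sketch (not Maphub’s actual output) of an Open Annotation-style record in JSON-LD: one free-text comment plus one semantic tag pointing at a DBpedia resource. The context URL, target URI and exact property names are assumptions for illustration only.

```python
import json

# A minimal, illustrative Open Annotation-style record for a map annotation:
# one free-text comment plus one "semantic tag" that is simply a link to a
# DBpedia resource. URIs below are placeholders, not Maphub output.
annotation = {
    "@context": "http://www.w3.org/ns/oa-context-20130208.json",
    "@type": "oa:Annotation",
    "motivatedBy": ["oa:commenting", "oa:tagging"],
    "hasBody": [
        {   # the user's free-text comment
            "@type": "cnt:ContentAsText",
            "chars": "This sheet shows the Strait of Magellan in detail.",
        },
        {   # the accepted semantic tag: a link to an open data resource
            "@id": "http://dbpedia.org/resource/Strait_of_Magellan",
            "@type": "oa:SemanticTag",
        },
    ],
    "hasTarget": {
        # a region of the digitised map, addressed by a hypothetical URI
        "@id": "http://example.org/maps/1234#xywh=2100,900,400,300",
    },
}

print(json.dumps(annotation, indent=2))
```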

Dr Haslhofer looks forward to seeing how collaborations will blossom between these various web annotation systems:

We believe that people also love other things on the Web and that Web annotation tools should support semantic tagging as well. Therefore, we will make it available as a plugin for Annotorious. Annotorious is a JavaScript image annotation library that can be used in any website, and is also compatible with the Open Knowledge Foundation’s Annotator.

Annotorious and Maphub have common origins, and the Open Humanities Award will support us in unifying parallel development streams into a single, reusable annotation tool that works not only for digitized maps but also for other media. We will also conduct another user study to inform the design of that functionality for other application contexts.

Award 2: Joined Up Early Modern Diplomacy: Linked Data from the Correspondence of Thomas Bodley


The second award goes to Dr Robyn Adams of the Centre for Editing Lives and Letters, University College London. The project will re-purpose the open resource that Dr Adams has been building with a team of others: the Diplomatic Correspondence of Thomas Bodley.

The project will use ‘additional’ information that was encoded during the digitisation of early modern letters at the Centre for Editing Lives and Letters. In the initial incarnation of the project, this data – which included biographical and geographical information contained within the letters – was encoded but not used.

Dr Adams told us a little bit about what she plans on doing with the money from the Awards:

With the prize funding from the Open Humanities Awards, we propose to mine the data that was generated but not fully used in the first phase of the project. This data is a rich source of biographical and geographical information, the visualization of which evokes the complex and diverse texture of the late sixteenth-century European diplomatic and military landscape. Bodley’s position in The Hague as the only English representative on the Dutch Council of State put him at the centre of a heterogeneous nexus of correspondents at a time long before the Republic of Letters burgeoned in the subsequent century.

The project will interrogate three data fields within the larger data set of Bodley’s diplomatic correspondence in order to generate visualizations: the network of correspondents and recipients, and the people and places mentioned within the letters. These visualizations will be incorporated into the project website, where they will enhance and extend the knowledge derived from the existing corpus of correspondence. The visualizations, which will have scope to be playful while drawn from scrupulous scholarship, will offer an alternative pathway for scholars and the interested public to understand that, in this period especially, political, university and kinship networks were fundamental to advancement and prosperity.

“In mapping the relational activity between data sets,” Dr Adams went on, “I hope to further illuminate and reanimate Bodley’s position within the Elizabethan compass. Furthermore, I hope to demonstrate that fruitful routes of enquiry can result if scholars commit to going the extra mile to encode and record data in their research that may not have immediate relevance to their own studies.”
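As a rough illustration of the kind of network analysis such a correspondence dataset invites, the sketch below builds a small directed graph of senders and recipients and ranks correspondents by degree. The letters listed are invented examples rather than records from the Bodley corpus, and networkx is simply one convenient library for the job, not the project’s actual toolchain.

```python
import networkx as nx

# Invented sample records in the spirit of a diplomatic correspondence dataset:
# (sender, recipient, year). These are NOT taken from the Bodley corpus.
letters = [
    ("Thomas Bodley", "Lord Burghley", 1589),
    ("Thomas Bodley", "Francis Walsingham", 1589),
    ("Lord Burghley", "Thomas Bodley", 1590),
    ("Thomas Bodley", "Robert Cecil", 1595),
    ("Robert Cecil", "Thomas Bodley", 1595),
]

# Build a directed graph, weighting each edge by the number of letters
# exchanged between that pair of correspondents.
g = nx.DiGraph()
for sender, recipient, year in letters:
    if g.has_edge(sender, recipient):
        g[sender][recipient]["weight"] += 1
    else:
        g.add_edge(sender, recipient, weight=1)

# Rank correspondents by total degree as a crude measure of how central
# they are in the network of senders and recipients.
for name, degree in sorted(g.degree(), key=lambda pair: -pair[1]):
    print(f"{name}: {degree} connections")
```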

We offer our heartiest congratulations to both Dr Haslhofer and Dr Adams, both of whom will be presenting their work at the forthcoming Web as Literature conference at the British Library and at this year’s OKCon in Geneva. Follow the progress of the Awards recipients via the DM2E project website.

Open Data Fellowship with Metropolitan New York Library Council Announced

Sam Leon - April 24, 2013 in Open GLAM


Exciting news: the Metropolitan New York Library Council (METRO), in collaboration with the Open Knowledge Foundation’s OpenGLAM initiative and Wikimedia NYC, has today unveiled the first-ever Open Data Fellowship for cultural heritage institutions, starting this summer. The paid 8-week placement will combine two roles:

  • Facilitator for institutions interested in pursuing broader open data initiatives
  • Wikipedian-in-Residence for member institutions in the METRO consortium

Position Details

Open Cultural Data role


  • Assist member institutions in collaborating with existing open cultural data initiatives from around the OpenGLAM network
  • Research and use open source tools for working with open cultural data for possible uptake by the library
  • Develop guides and manuals for GLAMs on opening up data
  • Contribute to an emerging multi-institutional linked open data project as needed

Wikipedian-in-Residence role


  • Assist METRO membership in their understanding and use of Wikipedia
  • Provide training and guidance on Wikipedia/Wikimedia use and WikiProjects
  • Assist membership with releasing collection content into Wikimedia (or other) Commons
  • Organize and host at least one Wikipedia-related special event or workshop

The Fellow is expected to document their experience through METRO, OpenGLAM, GLAM-Wiki, or other community channels.

Position Requirements

  • Must have experience creating or editing Wikipedia content, contributing to Wikimedia (or other) Commons, and/or using other open data platforms
  • Student (graduate or undergrad) preferred, but any qualified candidates will be considered
  • Experience working in GLAMs or other cultural heritage institutions is preferred
  • Some experience in user training or creating instructional resources is preferred
  • Must be a US citizen

About the Position

  • Stipend: $5000 for a full-time, 8-week term working a 35-hour week
  • Position Term: 8 weeks, start and end date flexible, but primarily during summer
  • Located: at METRO (57 E. 11th St., NYC); some possible work at member organizations (within New York City’s five boroughs and Westchester County)

How to Apply

Submit a cover letter (including your Wikipedia experience, username, and other skills you bring to the position) along with your resume and two references, including their contact information. Email the above in PDF format. Applications are accepted through May 15, 2013. Questions may be directed to Jefferson Bailey.

What We Hope the Digital Public Library of America Will Become

Jonathan Gray - April 17, 2013 in Bibliographic, Featured, Free Culture, Open Content, Open GLAM, Open Humanities, Policy, Public Domain

Tomorrow is the official launch date for the Digital Public Library of America (DPLA).

If you’ve been following it, you’ll know that it has the long term aim of realising “a large-scale digital public library that will make the cultural and scientific record available to all”.

More specifically, Robert Darnton, Director of the Harvard University Library and one of the DPLA’s leading advocates to date, recently wrote in the New York Review of Books that the DPLA aims to:

make the holdings of America’s research libraries, archives, and museums available to all Americans—and eventually to everyone in the world—online and free of charge

What will this practically mean? How will the DPLA translate this broad mission into action? And to what extent will it be aligned with other initiatives to encourage cultural heritage institutions to open up their holdings, like our own OpenGLAM or Wikimedia’s GLAM-WIKI?

Here are a few of our thoughts on what we hope the DPLA will become.

A force for open metadata

The DPLA is initially focusing its efforts on making existing digital collections from across the US searchable and browsable from a single website.

Much like Europe’s digital library, Europeana, this will involve collecting information about works from a variety of institutions and linking to digital copies of these works that are spread across the web. A super-catalogue, if you will, that includes information about and links to copies of all the things in all the other catalogues.

Happily, we’ve already heard that the DPLA will be releasing all of the data about cultural works that it collects under the CC0 legal tool – meaning that anyone can use, share or build on this information without restriction.

We hope they continue to proactively encourage institutions to explicitly open up metadata about their works, and to release this as machine-readable raw data.
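To give a concrete sense of what machine-readable, openly licensed metadata enables, here is a minimal sketch of pulling records from an aggregator-style API and reusing them. The endpoint, query parameters and response fields below are illustrative assumptions, not a documented DPLA interface.

```python
import json
import urllib.request

# Hypothetical aggregator endpoint and parameters, shown only to illustrate
# what reuse of CC0-licensed, machine-readable metadata looks like in practice.
API_URL = "https://api.example.org/v2/items?q=walt+whitman&api_key=YOUR_KEY"

with urllib.request.urlopen(API_URL) as response:
    results = json.load(response)

# Because the metadata is CC0, records can be stored, merged with other
# sources, and republished without further permission.
for record in results.get("docs", []):
    source = record.get("sourceResource", {})
    print(source.get("title"), "-", record.get("isShownAt"))
```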

Back in 2007, we – along with the late Aaron Swartz – urged the Library of Congress to play a leading role in opening up information about cultural works. So we’re pleased that it looks like the DPLA could take up the mantle.

But what about the digital copies themselves?

A force for an open digital public domain

The DPLA has spoken about using fair use provisions to increase access to copyrighted materials, and has even intimated that they might want to try to change or challenge the state of the law to grant further exceptions or limitations to copyright for educational or noncommercial purposes (trying to succeed where Google Books failed). All of this is highly laudable.

But what about works which have fallen out of copyright and entered the public domain?

Just as they are doing with metadata about works, we hope that the DPLA takes a principled approach to digital copies of works which have entered the public domain, encouraging institutions to publish these without legal or technical restrictions.

We hope they become proactive evangelists for a digital public domain which is open as in the Open Definition, meaning that digital copies of books, paintings, recordings, films and other artefacts are free for anyone to use and share – without restrictive clickwrap agreements, digital rights management technologies or digital watermarks to impose ownership and inhibit further use or sharing.

The Europeana Public Domain Charter, in part based on and inspired by the Public Domain Manifesto, might serve as a model here. In particular, the DPLA might take inspiration from the following sections:

What is in the Public Domain needs to remain in the Public Domain. Exclusive control over Public Domain works cannot be re-established by claiming exclusive rights in technical reproductions of the works, or by using technical and/or contractual measures to limit access to technical reproductions of such works. Works that are in the Public Domain in analogue form continue to be in the Public Domain once they have been digitised.

The lawful user of a digital copy of a Public Domain work should be free to (re-)use, copy and modify the work. Public Domain status of a work guarantees the right to re-use, modify and make reproductions and this must not be limited through technical and/or contractual measures. When a work has entered the Public Domain there is no longer a legal basis to impose restrictions on the use of that work.

The DPLA could create their own principles or recommendations for the digital publication of public domain works (perhaps recommending legal tools like the Creative Commons Public Domain Mark) as well as ensuring that new content that they digitise is explicitly marked as open.
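As an illustration of what “explicitly marked as open” could look like in practice, the sketch below stamps a hypothetical catalogue record with the Creative Commons Public Domain Mark for the digital copy and CC0 for the metadata. The record structure and field names are invented, though the two Creative Commons URIs are the real identifiers for those tools.

```python
# Real Creative Commons identifiers for the two legal tools discussed above.
PUBLIC_DOMAIN_MARK = "http://creativecommons.org/publicdomain/mark/1.0/"
CC0_WAIVER = "http://creativecommons.org/publicdomain/zero/1.0/"

def mark_record(record: dict) -> dict:
    """Attach explicit rights statements to a hypothetical catalogue record.

    The digital copy of a public domain work gets the Public Domain Mark;
    the descriptive metadata itself is waived into the public domain via CC0.
    """
    marked = dict(record)
    if marked.get("copyright_status") == "public domain":
        marked["rights"] = PUBLIC_DOMAIN_MARK
    marked["metadata_rights"] = CC0_WAIVER
    return marked

example = {
    "title": "Leaves of Grass (1855 edition)",
    "copyright_status": "public domain",
    "digital_copy": "https://example.org/scans/leaves-of-grass-1855.pdf",
}
print(mark_record(example))
```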

Speaking at our OpenGLAM US launch last month, Emily Gore, the DPLA’s Director for Content, said that this is definitely something that they’d be thinking about over the coming months. We hope they adopt a strong and principled position in favour of openness, and help to raise awareness amongst institutions and the general public about the importance of a digital public domain which is open for everyone.

A force for collaboration around the cultural commons

Open knowledge isn’t just about stuff being able to freely move around on networks of computers and devices. It is also about people.

We think there is a significant opportunity to involve students, scholars, artists, developers, designers and the general public in the curation and re-presentation of our cultural and historical past.

Rather than just having vast pools of information about works from US collections – wouldn’t it be great if there were hand-picked anthologies of works by Emerson or Dickinson curated by leading scholars? Or collections of songs or paintings relating to a specific region, chosen by knowledgeable local historians who know about allusions and references that others might miss?

An ‘open by default’ approach would enable use and engagement that breathes a life into digital content that it might not otherwise have – from new useful and interesting websites, mobile applications or digital humanities projects, to creative remixing or screenings of out-of-copyright films with new live soundtracks (like Air’s magical reworking of Georges Méliès’s 1902 film Le Voyage Dans La Lune).

We hope that the DPLA takes a proactive approach to encouraging the use of the digital material that it federates, to ensure that it is as impactful and valuable to as many people as possible.

We’re at SXSWi!

Sam Leon - March 8, 2013 in Open Data, Open GLAM


We’re pleased to say that the Open Knowledge Foundation will be at SXSWi this year.

I’ll be speaking on the Culture Hack panel alongside my colleagues from Europeana and the Digital Public Library of America, talking about why libraries, galleries, archives and museums should open up their content and data, and what’s being done to build a vibrant cultural commons around the digital public domain.

But I’ll also be there to meet those of you interested in other aspects of the Open Knowledge Foundation. Whether it’s open clinical trials, open spending data or some other aspect of the digital commons you’re interested in, I want to hear what you’re up to at SXSWi and whether there’s something I should pop along to. I’d also like to hear from anyone interested in setting up regional activity in the US around open knowledge. Whether you’re thinking about starting an initiative to open up city bus timetables or visualising state spending data, I’d love to hear what you’re up to and how the Open Knowledge Foundation can help.

I’ll be at the Idea Drop House on Saturday 9th March from 2pm with the Culture Hack panelists. More info is on the schedule here and the Culture Hack panel takes place on Monday 11th March at 11am. More info on the panel can be found here.

I’ll be tweeting from @Noel_Mas and checking email throughout – so let me know what’s going on!

First #OpenDataEDB of 2013

Ewan Klein - January 30, 2013 in Meetups, OKScotland, Open Data, Open GLAM, Open Government Data

The Edinburgh Open Data community started the year in fine style with a meet-up hosted by the National Library of Scotland on George IV Bridge. The turnout was excellent, with a wide range of participants. As usual, we had a number of lightning talks.

The meet-up started with a welcome from Darryl Mead, Deputy National Librarian, who pointed out that openness was at the core of the NLS mission, and that work was underway to make information about the holdings easily accessible.

Amy Guy reported on her visit to the 1st International Open Data Dialog in Berlin, 5-6 December 2012. She was impressed by how successful the event was in demonstrating that Open Data is of practical value right now, rather than in some indeterminate future. Amy has a detailed blog post about the event.

Freda O’Byrne emphasised that small voluntary organisations (such as Play-Base,  Duddingston Field Group, and Scatterbox Films) can be hugely helped by access to the right kind of data, particularly when they need to write a case for further funding or when they are trying to network with other relevant organisations.

Recent developments in the approach to Open Data by the Scottish Government were described by Ben Plouviez (Head of Knowledge Information and Records Management). Some of the main challenges stem from cultural attitudes to data within the civil service; the cost of publishing open data on a sustainable basis; and the development of technical infrastructure such as URI sets. Areas where we can expect to see progress include increased sharing of data between different public institutions within Scotland; publishing dynamic datasets rather than isolated snapshots; and a better appreciation of the value of data analytics by managers within the Scottish public sector.

Expanding on Darryl’s introduction, Gill Hamilton described recent initiatives in openness at the NLS, including plans to appoint a Wikipedian in Residence and the release of metadata for digital resources as Linked Open Data. Another issue under debate is whether it would be possible for the NLS to provide open access to the digital resources themselves without loss of revenue.

Andy Wightman described current obstacles to answering the question “Who owns Scotland?”, highlighting the fact that members of the public are currently unable to view information about land registration held by the Registers of Scotland without paying a fee. He had argued (unsuccessfully) during the passage of the Land Registration etc. (Scotland) Act 2012 that access should be free (fee income accounts for only 5.3% of the Registers’ revenue). The wider debate about land taxation and land reform is hampered by the inadequate public availability of data on land ownership.

It seemed as though lots of new connections were being made during the networking parts of the event, and some new collaborations were being hatched, possibly including a pilot project involving Scotland’s iconic Forth Rail Bridge.

Elevation and Plan drawing of the Forth Bridge, published within the Westhofen article on the construction of the Forth Bridge in Engineering, 1890, ©RCAHMS


The level of activity around Open Data in Scotland is definitely on the rise, with a lot of events and initiatives being planned.

Digital Public Library of America recommends CC0

Joris Pekel - January 22, 2013 in Open GLAM, Open Standards, Public Domain

The following post is cross-posted from the OpenGLAM blog.

On the OpenGLAM blog we have previously written about the Digital Public Library of America (DPLA), an initiative whose goal is to make the cultural and scientific heritage of humanity available, free of charge, to all. To achieve this goal, the board of directors has recommended applying the CC0 public domain waiver to the metadata.

The Digital Public Library of America


The vision of the DPLA is to provide one-click access to many different resource types, with the initial focus on producing a resource that gives full-text access to public domain books, e.g. from Hathi Trust, the Internet Archive, and U.S. and international research libraries. In order to create a truly complete collection of available content and data, it is important that the DPLA makes its data interoperable with other datasets from initiatives and institutions all over the world. For this to work, the various datasets have to be compatible, both legally and technically.

Last week, the board of directors of the DPLA had a meeting and the metadata licensing policy was discussed.

The proposed policy is as follows:
  • The DPLA asserts that metadata are not copyrightable, and that applying a license to them is not necessary.
  • To the extent that the law determines a copyright interest exists, a CC0 license applies.
  • The DPLA asserts no new rights over metadata at the DPLA level.

This is also reflected on the about page of the DPLA:

Metadata is a key part of the DPLA discovery framework; it describes content and resources in the DPLA, enables users to find them, and connects US holdings to holdings in other countries. The DPLA will aggregate existing library data and create new data; it will operate as part of a global linked data environment. All DPLA-created metadata will be made freely available in reusable form, except where doing so would violate personal privacy. All metadata contributed to or funded by the DPLA will be placed in the public domain.

Leaving Europe: a new life in America virtual exhibition

The decision to apply the CC0 Public Domain waiver to the metadata will greatly improve interoperability with Europeana, Europe’s equivalent of the DPLA. Now that more initiatives are starting to publish digitised heritage and its metadata, interoperability becomes ever more important in order to create a linked web of cultural heritage data instead of new data silos. By both choosing the CC0 Public Domain waiver, Europeana and the DPLA take a great step forward in achieving their goal.
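On a practical level, CC0 means that records from the two aggregators can simply be pooled and re-indexed without any rights negotiation. The sketch below merges a DPLA-style record and a Europeana-style record into a single index; both records and their field names are invented for illustration and are not actual output from either API.

```python
# Invented records roughly in the shape of the two aggregators' metadata;
# field names and values are illustrative, not actual API output.
dpla_record = {
    "sourceResource": {"title": "Emigrants boarding at Bremen", "date": "1874"},
    "isShownAt": "https://example.org/us-archive/item/42",
}
europeana_record = {
    "title": ["Auswanderer im Hafen von Bremen"],
    "year": ["1874"],
    "edmIsShownAt": ["https://example.org/de-archiv/objekt/99"],
}

def normalise(record: dict) -> dict:
    """Flatten either style of record into one shared shape.
    Pooling like this is legally possible because CC0 removes any
    rights barrier to merging and republishing the metadata."""
    if "sourceResource" in record:  # DPLA-style record
        src = record["sourceResource"]
        return {"title": src["title"], "date": src["date"],
                "link": record["isShownAt"]}
    return {"title": record["title"][0], "date": record["year"][0],
            "link": record["edmIsShownAt"][0]}

combined_index = [normalise(r) for r in (dpla_record, europeana_record)]
for item in combined_index:
    print(item["title"], "(", item["date"], ")", "->", item["link"])
```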

A first example of what can result from this collaboration is the virtual exhibition “Leaving Europe: a new life in America”, which presents the story of European immigrants using material from both the US and Europe.

The DPLA will launch on April 18 this year in Boston.

Consequences, risks and side-effects of the license module “non-commercial use only”

Joris Pekel - January 8, 2013 in Featured, Open GLAM

In 2012, a group of German copyright experts released, in collaboration with Wikimedia, the German document “Folgen, Risiken und Nebenwirkungen der Bedingung Nicht-Kommerziell – NC” (Consequences, Risks, and Side-effects of the License Module Non-Commercial – NC). In this document, they explain the consequences of choosing a CC license variant restricted to non-commercial use only (NC) and make clear why its usage is often unnecessary and even a bad idea for artists and institutions.

The public licenses developed by Creative Commons (CC) are tools to make creative works available for free use under certain conditions. As rights holders have different needs and motives, CC offers six different license variants. Some of the most popular license variants include the condition that the licensed works must not be used commercially. This has far-reaching and often unintended consequences for the dissemination of the respective works and sometimes even entirely thwarts what the licensor wants to achieve by choosing a CC license. This brochure aims to offer information on the consequences, risks and side-effects of the restrictive CC license variants that don’t allow commercial use.

As often discussed on the OKFN blog, the Creative Commons NC licenses cannot be considered truly open licenses, as they are not compatible with, for example, material under a CC Attribution-ShareAlike (BY-SA) license.

After reading this document, which was published under a CC BY license, we decided it was worth creating an English version as well. We put out a request to the German OKFN volunteers and got a couple of responses. Within a few days the complete document was translated. The original authors were then consulted and agreed to proofread the document. This was also a great opportunity to incorporate some of the comments they had received from the German Wikimedia community after publication. With the help of Wikimedia Deutschland, we were able to fit the document into the same design as the original.

And now, in early 2013, we are very happy to announce the final version of the document translated into English.

Download “Consequences, Risks, and side-effects of the license module Non-Commercial – NC” here.

Once again, we want to thank the OKFN community for producing this great publication. Special thanks go out to Thomas Hirsch, who translated the majority of the document.

Want to help spread Open Knowledge in your own language? Join the Task Force!

Why the German Digital Library should learn from Europeana

Joris Pekel - December 13, 2012 in Featured, Open GLAM

The full version of this article is available on the Open GLAM blog.

Launch of the DDB. Jill Cousins, Hermann Parzinger, Elke Harjes-Ecker, Matthias Harbort (from left to right) – Photo: Julia Hoppen
On the 29th of November 2012, the beta version of the German Digital Library (DDB) was officially launched. After five years of preparation and discussions with a large number of cultural institutions, it was finally time to bring it to the public. Hermann Parzinger, head of the Prussian Cultural Heritage Foundation, explained in the press release:

“The goal of the Deutsche Digitale Bibliothek (DDB) is to offer everyone unrestricted access to Germany’s cultural and scientific heritage, that is, access to millions of books, archived items, images, sculptures, pieces of music and other sound documents, as well as films and scores, from all over Germany”

To reach this goal, a lot of work needs to be done. At the moment, around 5.5 million metadata records can be found in the portal. Around 3 million come from a single institution, the Landesarchiv Baden-Württemberg. Currently 90 institutions provide data to the library, and the three biggest organisations make up more than 80% of all records. The goal of the DDB is to include the metadata records of more than 30,000 German cultural institutions.

In many ways, the German Digital Library is reminiscent of the Europeana project when it launched in 2008. At that time, France was responsible for about 50% of all records in the Europeana portal and many countries were not represented at all. In the past four years, Europeana has managed to include data from every EU country, and continues to expand it (see visualisation).

The interface of the DDB is very similar to Europeana’s as well: a simple search box combined with the ability to filter results in many different ways, for example by content provider, period or location. Like Europeana, the DDB is a search portal that links the user to the actual digitised object on the institution’s webpage; it only hosts the metadata.

Homepage of the German Digital Library

Unfortunately, one major difference with the current Europeana project is how the DDB deals with copyright. Europeana has recently released all of their metadata records under a CC0 public domain waiver, making all of their metadata records free to use and reuse by anybody for any purpose without any restrictions.

The German Digital Library is quite the opposite. Their Terms of Use state clearly that:

  1. The DDB and its data suppliers retain all copyright and other industrial property rights to the public, freely accessible (and free of charge) digital content, derivatives and metadata including layout, software and their content in line with the principle of „free access – rights reserved“.
  2. Any use of the digital content, derivatives and metadata for commercial purposes is hereby prohibited, unless approved by the title holder in individual cases.

These copyright restrictions make it very hard for users to do anything with the metadata from the DDB. Especially when the API is launched, it will be practically impossible for developers to create something with it, as they will constantly have to ask the hundreds of different institutions whether it is allowed. When Europeana started, there was also no consensus on how to deal with the rights of the aggregated metadata, and it took four years to solve this issue. Over the last couple of years, the European Union, Europeana itself, and many other organisations have released reports and documents that clearly outline the advantages of open data for cultural institutions, as well as for society and research.

It seems like a strange move that the DDB is so restrictive, especially as it is to become the official German aggregator for Europeana. Europeana has been very clear since last September that the rights to all the data provided have to be waived using the CC0 declaration. Furthermore, many objects from, for example, the Landesarchiv Baden-Württemberg can already be found on Europeana under a free license.

As more of the world’s heritage becomes available online, great new possibilities arise. Different collections can be connected and linked, and institutions can enrich their own data with that of others. The history of Germany can be found not only in German institutions, but all over the world. By combining these different collections, historians can create a much more sophisticated history and find new stories and insights. This can only be achieved if the licenses used by the different institutions and aggregators allow it, and the DDB terms of use clearly do not.

As the German Digital Library is still in beta, much can change. It is a direct partner of Europeana, so it should be easy to learn from Europeana’s experiences and from how its past decisions about copyright have worked out. Europeana has shown that European institutions are willing to provide data that can be freely reused – why start the discussion all over again in Germany?

The Myth of European Term of Protection Harmonisation

Christina Angelopoulos - November 21, 2012 in Featured, Open GLAM, Public Domain, Public Domain Works, WG Public Domain

This blog post is based on Christina’s paper, “The Myth of European Term Harmonisation – 27 Public Domains for 27 Member States”. This is a shortened version of the post – the full version is available on the OpenGLAM blog.

Copyright is supposed to be a temporary right: once it has expired, works automatically fall into the public domain for free public access and enjoyment. This arrangement is especially important today, in view of the opportunities that internet technologies offer for the online distribution and reuse of out-of-copyright works: electronic repositories of culture such as Europeana, Project Gutenberg or Google Books are currently attempting to digitise and make available online out-of-copyright works, while modern participatory culture means that even individual users can more easily share old works or incorporate them into their own creative output.

Public domain calculators are technical tools to help determine when a work falls into the public domain. The idea is to provide a measure of legal certainty to cultural heritage institutions, as well as the average user, that they are not inadvertently infringing creators’ copyright, allowing them to confidently work with material for which copyright has expired and thus helping to sustain a vibrant, up-to-date and functional public domain.
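By way of illustration only, the sketch below applies the Term Directive’s baseline rule – protection lasting 70 years after the author’s death, counted from 1 January of the year following death – to estimate when a simple single-author work enters the public domain. As the rest of this post explains, real calculations depend on jurisdiction-specific exceptions, transitional provisions and related rights, so this toy function is nothing like a substitute for the full Calculators.

```python
def public_domain_year(author_death_year: int, term_years: int = 70) -> int:
    """Baseline EU rule of thumb: copyright runs for `term_years` after the
    author's death, with terms counted from 1 January of the year following
    the relevant event, so the work enters the public domain on 1 January of
    the year returned here. National exceptions can change this considerably.
    """
    return author_death_year + term_years + 1

# A single-author literary work by someone who died in 1941 would, under the
# baseline rule alone, enter the public domain on 1 January 2012.
assert public_domain_year(1941) == 2012
print(public_domain_year(1941))
```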

As has been mentioned before on this blog, as part of the EuropeanaConnect project, Kennisland and the Institute for Information Law (IViR) of the University of Amsterdam set about creating one such calculator, concentrating on the term of protection rules in Europe. With the final tool now ready and available online, below we lay out some of the main conclusions drawn during the building process about the intricacies and limitations of European term of protection rules.

Term Disharmonisation and its Causes

When does a work enter the public domain? In EU Member States, the answer to this question should at least be “at the same time” – the Term Directive, one of the first European copyright directives adopted, was intended to leave no room for national deviations from the harmonised European norm.

Nevertheless, careful examination of the Directive’s rules reveals that it has not entirely succeeded in this objective. The way in which the rules laid down by the Directive have been incorporated into national law has differed from Member State to Member State, leading to divergences of up to fifty years for particular works! As a result, the composition of the public domain differs from country to country, as works fall out of copyright on different dates in different EU Member States: the European public domain contracts and expands along the pattern set by national legislative quirks.

The construction of the Public Domain Calculators helped identify the following main sources of legislative variability in this area:

1. Inconsistent Terms = Inconsistent Term

Inconsistency in substantive legal terms is rife, leading inevitably to inconsistency in term calculations. A work may qualify as a work of joint authorship in one jurisdiction, a collective work in another and a collection of two or more separate works in a third, producing totally different periods of protection. The European Commission addressed this problem for co-authored musical works with the recent amendment of the Term Directive in September 2011, but has left the problem looming in other areas.

2. Exceptions to Harmonisation

The next problem is the array of explicit exceptions within the Term Directive. These occur in three areas: transitional provisions preserving longer terms of protection already running in a Member State (of which there are plenty); moral rights, an area generally left untouched by European legislation; and related rights over subject matter originating from outside the EU.

3. Related Rights

The Term Directive limits itself to the related rights of performers, producers of phonograms, broadcasting organisations and producers of first fixations of films. But Member States are allowed to introduce or maintain other related rights whose term will be determined exclusively by national law.

A variety of such rights can be found across the EU, from non-original photographs (Austria, Denmark, Finland, Germany, Italy, Spain, Sweden) to the typographical arrangement of a published edition (Greece, Ireland, Spain, UK), producing a maze of different rights each with its own term of protection.

4. Incorrect Implementation

Finally, divergences between Member State rules might simply result from the incorrect implementation of the Term Directive. Although this is obviously a risk run with any harmonising attempt, the complicated calculations, hierarchy of rules and transitional provisions of the Term Directive do not lend themselves to smooth transposition.


The calculation of the term of protection ought to be a straightforward exercise that any copyright layperson (or at least those with enough copyright knowledge to be able to properly identify the applicable rights) should be able to confidently undertake. Yet this is far from the case.

This effect was illustrated in the Public Domain Calculators by the need for separate electronic tools – on occasion giving very different results – for each of the 30 jurisdictions covered, including the 27 ostensibly harmonised EU Member States. The incomplete harmonisation of the term of protection effectively multiplies the complexity of the calculation process in Europe twenty-seven-fold! As a result, the Calculators are accompanied by a broad disclaimer explaining that they cannot replace case-by-case assessment by a legal expert, and it is also for the above reasons that the very concept of automated calculation is approached warily by copyright experts.

But the problem lies not with the concept of electronic term of protection calculation in itself, but with outdated, badly harmonised and obscure rules that fail to live up to the requirements of the internet era, thus hampering end-users and cultural heritage organisations from taking full advantage of the new opportunities now technically available. Certainly, full harmonisation of European rules on the term of protection would not do away with all the difficulties created by the current, particularly convoluted, calculation process – but it would go a very long way towards simplifying the requirements for rights clearance across the EU by replacing 27 sets of complicated rules with just one.

Readers are invited to give feedback on the Public Domain Calculator on the pd-discuss list.
