In Part 1 of this series, we provided an overview of data (or knowledge) commons and some key issues to consider, but how does one actually create and manage a data commons? To find your feet in this budding field, build on the theoretical foundation; address the specific context (including perceived objectives and constraints); deal with the thorny issues (including control and change); establish a core set of principles and rules; and, perhaps most importantly, plan for and enable change.
You may have heard of the “tragedy of the commons,” where a shared resource is depleted because individuals act in their own self-interest, but knowledge is different from other resources—knowledge can be duplicated, aggregated, integrated, analyzed, stored, shared, and disseminated in countless ways. Given that knowledge is a critical resource for seemingly intractable problems, the opportunity of the commons (or the tragedy of the lack of commons) is worth thoughtful consideration.
Imagine that you or a loved one is suffering from a terminal or debilitating disease and that data and knowledge are out there, waiting to be combined and harnessed for a cure or a transformational treatment. Imagine that self-interest (including attribution), legal restrictions (including intellectual property protections), inertia, complexity and difficulty of collective action, and other weighty forces are between you and that breakthrough discovery. Though not a new concept, commons have been garnering attention lately as an alternative framework for catalyzing groundbreaking research and development, particularly when relevant data and knowledge are scattered and particularly in the life sciences community. But before we all throw away our patents and data-dump our trade secrets, there are some thorny aspects to governing a data (or knowledge) commons. For example:
- A commons is essentially its own society. Anyone who has been part of a homeowners’ association knows that collective governance is almost always messy. Aligning incentives, objectives, and values can be challenging.
- Founders may have trouble relinquishing control or enabling change. Participants may become confused or upset if rules or priorities change.
- Commons are less well understood and tested than traditional governance structures. They must coexist with, and within, other systems that may be more rigid and rules-based. Participants may be logistically, intellectually, and otherwise tied to traditional methods and may prefer semi-exclusive zones to open collaboration.
- It may be difficult to measure the effectiveness or value of commons.
- Policing activities (e.g., authentications or restrictions) may be burdensome. And once the cat is out of the bag, it’s difficult to undo uses or disclosures.
- Commons managers may not be willing to take on certain responsibilities or liabilities that would make participants more comfortable.
- Different types of information and tools have different levels of sensitivity and protection. Certain information, like personal data, is highly regulated.
Scholars have taken theoretical frameworks built for natural resources and adapted them to the data commons setting. Key findings include that data commons must be designed to evolve and that communities with high levels of shared trust and values are most likely to succeed. Whereas governance through exclusivity (e.g., patents) is useful when trust levels are low, a resource sharing governance model (e.g., commons) can be effective when trust levels are high.
If you’d like to know more:
- We will be hosting a webinar with one of the aforementioned scholars—Professor Michael J. Madison, faculty director at PittLaw—on Tuesday, December 18, 2018, from 12:00 pm to 1:00 pm ET. Register and join us for the discussion.
- In a subsequent post, we will provide some tips and considerations with respect to drafting policies, standard terms, data contribution agreements, and other governing documents for data commons.
Knowledge sharing has long been an important element of academic research. And now collective sharing and governance of data assets throughout the scientific community, including for-profit participants, is gaining momentum. During their webinar, Out in the Open: The Knowledge Commons Framework, Emily Lowe, Ben Klaber, and Professor Michael J. Madison, faculty director at PittLaw, will discuss issues related to knowledge commons. Topics will include the following:
- A fundamental overview of knowledge commons, including the framework’s strengths and weaknesses
- Standard requirements regarding data contribution, access, use, sharing, protection, and attribution
- How to decide if a knowledge commons framework is right for your business, and if so, how to implement it successfully
Washington, DC partners Giovanna M. Cinelli, Kenneth J. Nunnenkamp, and Stephen Paul Mahinka and Boston partner Carl A. Valenstein recently published a LawFlash on action taken by the Committee on Foreign Investment in the United States (CFIUS) to implement a pilot program under the Foreign Investment Risk Review and Modernization Act (FIRRMA). FIRRMA, which was enacted in August 2018, reformed the CFIUS screening process for foreign investment in the United States and, among other things, permits CFIUS to establish pilot programs to test the viability of certain of its provisions. The LawFlash addresses the objectives and scope of the announced pilot program, including the countries and types of investments covered. It also describes the pilot program's new requirement for mandatory declarations "for certain transactions involving investments by foreign persons in certain U.S. businesses that produce, design, test, manufacture, fabricate, or develop one or more critical technologies." The pilot program becomes effective November 10, 2018.
For more information on the pilot program, please read the LawFlash.
The Illinois Biometric Information Privacy Act (IBIPA) has been grabbing headlines of late as class action lawsuits under IBIPA’s private right of action are piling up, but an Illinois state appeals court recently held that a plaintiff “must allege some actual harm,” potentially stemming the flood of litigation.
Noting that biometric identifiers are biologically unique and permanent (unlike, for example, passwords) and thus irreparably problematic if compromised, IBIPA regulates the collection, retention, disclosure, and destruction of biometric identifiers and biometric information.
Under the statute, “biometric identifiers” are retina or iris scans, fingerprints, voiceprints, and hand or face geometry scans. Certain items, such as writing samples, written signatures, and physical descriptions, are specifically excluded. The second category of regulated data, “biometric information,” broadly includes any information “based on an individual’s biometric identifier used to identify an individual.” Companies, therefore, can’t evade the purview of the law by converting a biometric identifier into a new identifier—say, a unique number.
On Thursday, June 22, Morgan Lewis partners W. Reece Hirsch and Mark L. Krotoski and associate Jacob J. Harper will discuss best practices for defending against data breaches involving protected health information. Topics will include the following:
- Implementing an effective security breach response plan
- Responding to the threat of ransomware such as WannaCry
- Lessons learned from recent Office for Civil Rights (OCR) enforcement actions
- What the HIPAA Phase 2 audits can tell us about OCR’s breach response expectations
On October 6, Federal Communications Commission (FCC) Chairman Tom Wheeler released a factsheet outlining proposed rules aimed at protecting broadband consumers’ privacy. The proposed rules would apply to internet service providers (ISPs) and cover data collection, usage, security, and breach notification.
If the rules are adopted, ISPs would need to notify their consumers about the types of data being collected, when and how collected consumer data can be shared, and the types of entities with which ISPs can share the information. ISPs would also be required to adopt reasonable measures to protect consumer data from data breaches and other vulnerabilities.
On October 13, partner Andrew Lipman will present a webinar, “The 2016 Election: Telecom, Media, and Tech Impacts.” The webinar will cover the pre- and post-election legal and regulatory landscapes applicable to the telecom, media, and technology industries.
Andrew will discuss various consequences of the election results on US Congress, federal courts, the Federal Communications Commission (FCC), the US Department of Justice, and other government agencies. He will also specifically cover the FCC’s policies on competition and antitrust, net neutrality, spectrum ownership, broadband deployment and adoption, consumer privacy, and data security.
The webinar will be held October 13, 2016, from 2:00 to 3:00 pm ET. To learn more and to sign up, please visit the webinar’s event page.
As of September 30, Russian state authorities now reject tender submissions for supply of certain foreign electronic equipment if there are two concurrent submissions for supply of locally produced equipment. The ban applies to 113 types of equipment, including personal computers, printers, memory cards, mobile and landline phones, TV sets, cameras, microphones, cash terminals, and ATMs.
Electronic equipment may qualify as local if it is produced under a special investment contract between an investor and the federal or a regional government, or if it is fully manufactured or significantly reprocessed in Russia. Additional localization criteria specific to certain equipment also apply.
As part of our Sourcing and Technology Lunchtime Series, partners Michael Pillion and Peter Watt-Morse recently spoke during their webinar “The Next Frontier: How Robots and Automation are Changing Outsourcing and Technology Agreements.”
The webinar highlighted the emerging market for robotic process automation and artificial intelligence software and the adjustments to services, pricing models, and contractual provisions that arise from adopting this technology.