
Contributors:

Allison Bender: AI & Chatbots
Zach Lerner: Cryptocurrency
Jeff Landis: Connected Cars
Jason Wool: DNA
Ken Dreifach: Consent Walls
Nury Siekkinen: Blockchain

AI & Chatbots With the increasing use of AI and chatbots, the law will be playing catch-up to technology over the next year or so. Courts, regulators, and policymakers will need to consider the degree to which companies will be held liable for statements made by a hacked chatbot. It’s one thing if a hacked chatbot begins spewing gibberish, another if it starts engaging in hate speech, and quite another if it provides inaccurate or harmful information to unsuspecting consumers. Companies relying on chatbots for any external communication function should consider how they would manage this type of incident and how to monitor the integrity of the technology.

The potential consequences of failing to do so vary. A hacked chatbot that is unresponsive or spews gibberish does not achieve its design goals and may reflect poorly on the company, potentially leading some consumers to question the quality of its other products and services. A hacked chatbot that engages in hate speech may cause reputational harm, as consumers may wonder which chat messages are legitimate, how the chatbot learned such speech, and why the company permitted that type of learning or speech. And chatbots could be used to amplify inaccurate or harmful messages in ways that negatively affect the company and its consumers. As we learn more about how social media may have been used to influence the recent U.S. presidential election, important questions arise about the responsibilities of social media companies and users in evaluating the context of information shared online. Companies that use chatbots should be prepared to monitor the effectiveness and integrity of this technology in order to prevent and mitigate the problems that may arise.

Cryptocurrency The rise of cryptocurrencies—like Bitcoin—is raising fascinating legal questions. State and federal authorities are sorting out who should regulate the various aspects of cryptocurrency technology, focusing on issues like whether cryptocurrencies and initial coin offerings should be treated like securities and how they fit within the Internal Revenue Code. Additionally, those offering cryptocurrencies could potentially face claims of fraud in the form of class action lawsuits or regulatory investigations. The impact of cryptocurrency regulation, however, will extend beyond those offering and investing in the currencies. For example, merchants accepting cryptocurrencies as a form of payment may have trouble complying with anti-money laundering statutes and identity verification laws. Furthermore, regulators will scrutinize the advertising of cryptocurrency sales, building on the SEC’s warning to celebrities about endorsement disclosure requirements. The range of issues that those who accept, use, or promote cryptocurrencies might encounter is nearly limitless.

Connected Cars The volume (and variety) of data that “connected cars” — cars capable of connecting to a network — generate will only increase in the coming year. This will likely increase the frequency with which law enforcement seeks such data as part of criminal investigations. It is not hard to imagine, for example, law enforcement wanting records of a car’s precise location, logs of calls made from the car, or even recordings of conversations that took place within the car. Depending on the car and the data, such requests may be directed to the car’s manufacturer, the provider of the relevant software, or even a cloud storage company. Any such entities that come into possession of data from connected cars will need to have a protocol for responding to law enforcement requests, and be aware of their rights—and obligations—with respect to the data being sought.

DNA We will soon have too much data to store efficiently using today’s technologies. Using DNA – the molecule that encodes genetic blueprints – to store electronic data may solve this problem. Instead of being written in binary on silicon, data is encoded using the four nucleotide bases (adenine, cytosine, guanine, and thymine), each of which can represent a pair of binary digits. Assuming this experimental technology becomes commercially viable, it could present several tricky issues. Is the data genetic material? What if someone places it into their body? A slew of liability, IP protection, and other legal concerns could arise from mixing big data with genomics. And researchers have already successfully encoded malware into DNA, demonstrating that hacking will always keep pace with science. For now, this is more in the realm of thought experiment, but as companies begin to tackle the data storage conundrum they should be sure to consider all possible ramifications of emerging technologies that may at first glance appear to be a golden ticket.
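To make the two-bits-per-base idea concrete, here is a toy Python sketch. The particular mapping (00→A, 01→C, 10→G, 11→T) is an assumption chosen for illustration; real DNA storage schemes use more elaborate codes that add error correction and avoid long runs of the same base.

```python
# Toy illustration only: encode arbitrary bytes as a string of DNA bases,
# two bits per nucleotide, and decode them back. The bit-to-base mapping
# below is an arbitrary assumption, not any published scheme.

BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BITS_FOR_BASE = {base: bits for bits, base in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Encode bytes as a DNA string: four nucleotides per byte."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):  # most-significant bit pair first
            bases.append(BASE_FOR_BITS[(byte >> shift) & 0b11])
    return "".join(bases)

def decode(dna: str) -> bytes:
    """Invert encode(): rebuild each byte from four nucleotides."""
    out = bytearray()
    for i in range(0, len(dna), 4):
        byte = 0
        for base in dna[i:i + 4]:
            byte = (byte << 2) | BITS_FOR_BASE[base]
        out.append(byte)
    return bytes(out)

message = b"hi"
strand = encode(message)        # "CGGACGGC"
assert decode(strand) == message
```

The round trip shows why the approach is attractive for density: a byte of data needs only four bases, and a gram of DNA holds on the order of 10^21 bases.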

Consent Walls and Paywalls Under GDPR The widely anticipated GDPR imposes elevated consent requirements on websites and others for certain activities. Many sites will need to obtain “freely given, specific, informed and unambiguous” consent for multiple third party ad platforms if they want to continue working with these platforms. (A new IAB-EU protocol for seeking collectivized consent is described here.) A dilemma, however, is that the GDPR lets users refuse consent, especially for data collection that isn’t necessary for the particular services. When users refuse consent, publishers incur risk by outright refusing to provide content, but (depending how certain GDPR provisions are interpreted and enforced) they may be allowed to incentivize consent through paywalls or similar mechanisms. Web publishers therefore will want to consider questions such as:

  • Whether and how to incentivize consent, e.g., through paywalls. “Premium” publishers with unique, popular content may be well-positioned to do so, if European regulators interpret the GDPR as permitting it,
  • Whether their ability to obtain consent for their partners might increase their value as partners,
  • Whether they should incentivize consent in another way – perhaps by piggybacking on platforms’ consent, and
  • Whether seeking additional or broader consents may lead to a decrease in overall consent percentages.
2018 may see services develop to accommodate this consent incentivization. And companies with strong content may be in high demand – as partners or even acquisition targets.

Blockchain Smart Contracts Blockchain is a highly secure and transparent bookkeeping system, or ledger, made up of digitally recorded, cryptographically linked blocks of data stored across a distributed network of computers. Because blockchain is distributed, immutable, and resistant to hacking, it carries the promise of a very high level of data integrity. One of blockchain’s most exciting applications goes beyond currency. Blockchain facilitates “smart contracts”: software that automatically executes a contract, with no human intervention, once coded conditions are met (for example, transfer funds from my account to a charity after it meets a certain funding goal). Smart contracts could replace or become part of many systems requiring secure recordkeeping—from escrow agreements to medical data—thereby eliminating costly intermediaries. But before using such contracts, companies will want some understanding of how the technology works, the benefits it provides, and the legal and technological requirements and obligations that come with using it.
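The charity example above — funds released automatically once a funding goal is met — can be sketched in ordinary Python. This models only the conditional-execution logic; real smart contracts run as code on a blockchain (for instance, written in Solidity on Ethereum), and the class and method names here are hypothetical.

```python
# Toy model of a smart contract's conditional execution: pledges are
# held until a coded condition (the funding goal) is met, at which point
# the "transfer" fires with no human intervention. Not real blockchain
# code; names and structure are illustrative assumptions.

class CharityEscrow:
    def __init__(self, goal: int):
        self.goal = goal        # funding goal that triggers release
        self.pledges = {}       # donor -> amount held in escrow
        self.released = False   # becomes True once the contract executes

    def pledge(self, donor: str, amount: int) -> None:
        if self.released:
            raise RuntimeError("contract already executed")
        self.pledges[donor] = self.pledges.get(donor, 0) + amount
        self._maybe_execute()   # condition is checked on every state change

    def total(self) -> int:
        return sum(self.pledges.values())

    def _maybe_execute(self) -> None:
        # The "no human intervention" step: release happens automatically
        # as soon as the coded condition holds.
        if self.total() >= self.goal:
            self.released = True

escrow = CharityEscrow(goal=100)
escrow.pledge("alice", 60)
assert not escrow.released      # goal not yet met, funds stay in escrow
escrow.pledge("bob", 50)
assert escrow.released          # 110 >= 100: executed automatically
```

On an actual blockchain, the state (pledges, released) would live in the distributed ledger and the condition check would run identically on every node, which is what removes the need for a trusted intermediary.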