When it comes to law and technology, the future is wide open. Will artificial intelligence transform the practice of law? Will electronic surveillance mean that nothing can be truly confidential? Will you ever be able to find a password that you can remember? Yes. No. Maybe.
The future of law and technology is full of unanswered questions. Here are the ones we're asking.
Machine learning, the ability of software to improve itself over time, is becoming increasingly common. Self-teaching machines can now create recipes never seen before, win on Jeopardy, and even answer simple legal questions. But the technology poses major questions, for lawyers and for the law itself. Could a machine learning algorithm learn to subvert the law?
Artificial intelligence isn't just for science fiction anymore. Tech companies are making real advances in AI, creating technology that can aid, and possibly replace, professionals like attorneys. And two AI products are already aimed at the legal market. How could they impact your practice?
Blockchain is the tech underlying "cryptocurrencies" like Bitcoin. But while Bitcoin might be for nerds and Australians, the blockchain could have major implications for encryption and authentication. And it's being touted as a potentially revolutionary technology for lawyers. Here's why.
You have a password for your personal email, for your work email, for your bank account, your newspaper, your gym's website. Each one requires something slightly different. Each one expires at a different time. It's no wonder you can't keep them straight. Passwords are horrible. Thankfully, biometrics might make them a thing of the past.
Should Apple be forced to decrypt its smartphones? Will changes in the law increase government snooping? Can law firms protect themselves against hackers? These are the debates that we cover all the time. And the typical answer is along the lines of, "If you want to be safe, you need more encryption. Better encryption." Here's why that answer might be, at least partially, wrong.