In a chilly conference room at San Francisco's Hyatt Regency on Monday, legal and digital security pros convened at USENIX's Enigma conference to hold forth on security, privacy, and related matters.
Following a discussion panel on encrypted messaging, the talk turned to mitigating the risks that come with using third-party code, external vendors, and crowdsourced advice.
Those risks became more apparent in the security problems spotted in a series of software libraries over the past few years.
In August last year, a Ruby software package called rest-client was found to be sending credentials to a remote server. In November 2018, the NPM module event-stream was modified to steal cryptocurrency. There were similar incidents in July last year involving the NPM module electron-native-notify and in September 2017, when PyPI, the repository for Python software packages, was found to be hosting malicious software libraries.
While in theory no one should use anyone else's code without a thorough security review, that's impractical in the open source software ecosystem, where so many applications depend on code libraries written and maintained by third parties, and those libraries, in turn, depend on still more third-party libraries.
So the presenters explored ways to deal with risky trust relationships.
Filippo Valsorda, a cryptography engineer on the Go team at Google, offered an overview of the Go checksum database, a system deployed last year to provide a central log of Go module checksums - the values returned from a cryptographic hash function to verify the modules.
"We all use other people's code," he said. "Modern software development practices involve using third-party software that is made available through the open source ecosystem."
Valsorda explained that the Go team has attempted to design a system that ensures the integrity, availability, and provenance of third-party code. And he said the team had the benefit of seeing where other software repositories went wrong.
Go developers can use the go command client to verify the log entries stored in the Go checksum database, which stores checksums for all publicly-available Go modules. This doesn't guarantee that a library is free of malicious code, but it does ensure that the library hasn't been altered without authorization from its author.
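The core idea can be sketched as follows: hash the downloaded module bytes and compare the result against the entry the checksum database holds for that module version. Note the real go command uses a directory-hash scheme over a module's files rather than a plain SHA-256 of a single blob; this is a simplified illustration, and the checksum value below is just the SHA-256 of the sample input.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// verifyChecksum hashes the fetched module bytes and compares the
// digest against the expected value from the checksum log. A mismatch
// means the module was altered since the checksum was recorded.
func verifyChecksum(moduleBytes []byte, expectedHex string) bool {
	sum := sha256.Sum256(moduleBytes)
	return hex.EncodeToString(sum[:]) == expectedHex
}

func main() {
	data := []byte("hello") // stand-in for a module's contents
	want := "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824"
	fmt.Println(verifyChecksum(data, want))
}
```

Because the log is append-only and publicly auditable, an attacker who tampers with a module cannot quietly update the recorded checksum to match.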
Valsorda pointed to the left-pad incident - when the creator of an NPM module unpublished his code and mayhem ensued - to emphasize why code availability matters.
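What made the incident memorable is how small the package was: the entire library amounted to a few lines of string padding, yet its removal broke builds across thousands of dependent projects. A rough Go equivalent of that functionality:

```go
package main

import (
	"fmt"
	"strings"
)

// leftPad pads s on the left with the pad character until it reaches
// the requested width - roughly what the NPM left-pad package did,
// ported here as a sketch to show how little code was involved.
func leftPad(s string, width int, pad rune) string {
	if len(s) >= width {
		return s
	}
	return strings.Repeat(string(pad), width-len(s)) + s
}

func main() {
	fmt.Println(leftPad("42", 5, '0'))
}
```

A trivial utility, but once thousands of packages transitively depend on it, its availability becomes everyone's problem.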
"The Go solution here is that there is a proxy protocol specified that allows you to fetch modules," he said. "And as long as the license of a certain module allows for distribution, we will hold on to the contents so that even if they get deleted, they will still be available for you to build."
There are privacy implications in Google's oversight of the central Go module database. These involve the possibility of exposing the text of private module paths and exposing how developers use public modules. Google has tried to reduce these privacy consequences by supporting proxy servers that other organizations can run on their own.
Companies, he said, "can run their own proxy, which will cache everything that has ever been used in an organization and guarantee within the organization that everything will still be available in the future for as long as the internal infrastructure is accessible."
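In practice this is configured through the go command's environment variables. The variable names below are the real ones (GOPROXY and GOPRIVATE); the host names are hypothetical placeholders for an organization's own infrastructure.

```shell
# Route module downloads through an internal caching proxy
# (hypothetical host), falling back to direct fetches for
# anything the proxy does not hold.
export GOPROXY=https://goproxy.internal.example.com,direct

# Exempt private module paths from checksum-database lookups,
# so internal module names are never sent to sum.golang.org.
export GOPRIVATE=*.corp.example.com
```

With the proxy caching every module an organization has ever fetched, builds keep working even if a module later vanishes from its upstream repository.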
In the presentation that followed, Sarah Harvey, a security engineer for payments biz Square, examined the workflows organizations can use when integrating third-party vendor systems to reduce the risk of bad outcomes. She pointed to the 2013 hacking of Target's payment system through credentials that had been granted to its HVAC contractor as an example of the potential consequences of giving a third party too much network access.
Harvey described the integration flow that third-party vendors go through to connect to Square's systems. It basically involves filling out online forms that specify contextual information about vendors and their products, descriptions of the data being transferred, and the network domains required to make the relationship work. That information must then be translated into network and policy rules.
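That translation step - from form answers to network rules - can be sketched as below. The types, field names, and rule format are hypothetical illustrations, not Square's actual tooling.

```go
package main

import "fmt"

// VendorRequest is a hypothetical record of the form data a vendor
// integration submits: who the vendor is, which domains they need to
// reach, and on what port.
type VendorRequest struct {
	Vendor  string
	Domains []string
	Port    int
}

// toAllowRules turns a request into narrow egress allowlist entries,
// one per declared domain - so the vendor gets only the access it
// asked for, instead of broad network reach.
func toAllowRules(r VendorRequest) []string {
	rules := make([]string, 0, len(r.Domains))
	for _, d := range r.Domains {
		rules = append(rules, fmt.Sprintf("allow %s -> %s:%d", r.Vendor, d, r.Port))
	}
	return rules
}

func main() {
	req := VendorRequest{
		Vendor:  "acme-hvac",
		Domains: []string{"api.acme.example"},
		Port:    443,
	}
	for _, rule := range toAllowRules(req) {
		fmt.Println(rule)
	}
}
```

The point of generating rules from declared data is that access stays scoped to what the vendor disclosed, rather than defaulting to blanket network credentials of the kind that undid Target.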
Because forms of this sort introduce friction that could discourage thorough disclosure, Harvey said she did a lot of work on the UX and UI design to auto-populate many of the data fields.
"You have to be very calculated about the amount of friction you are introducing, and try to reduce it as much as possible to get people through the system," she said.
The third presentation on the topic of third-party trust involved Felix Fischer, a security researcher at Technical University of Munich, delving into the ups and downs of Q&A site Stack Overflow as a source of code examples. Fischer and others have penned papers [PDF] on the security consequences of relying on community-contributed code, but he had more in mind than rehashing past findings about the problem with copying-and-pasting insecure snippets into apps.
"Ninety-seven per cent of apps that reuse code from Stack Overflow applied insecure code," he said. On the other hand, he said, some 70 per cent of code examples from the Q&A site incorporated helpful advice that applied security best practices. So good advice is available on Stack Overflow. However, only 6 per cent of Google Play apps reuse those code examples.
The reason that bad advice becomes more popular than good advice, he explained, has to do with the incentive structure of Stack Overflow, where people earn reputation points by duplicating popular answers and reposting them.
"What we found was that over a third of the so-called highly-trusted users, users with a particularly high reputation score, posted insecure code," he said. "So all the very meaningful indicators on Stack Overflow were indeed pointing in the wrong direction."
Denying developers access to Stack Overflow won't help and would probably make things worse, said Fischer. He argues that behavioral science can be used to guide UX and UI modifications to Stack Overflow that help nudge developers to make the right security choices without taking away their freedom.
"We developed a nudge system based on deep learning that knows what suggested code examples are about and whether they're insecure or not," he said.
One way this was tested involved having the nudge system re-rank search results on Stack Overflow to present the most helpful and secure advice first. The system also warned about insecure advice within discussion threads while also always offering safe alternative solutions.
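The re-ranking idea can be sketched as a sort that promotes secure answers ahead of insecure ones while preserving community score order within each group. The Answer type and Insecure flag below are hypothetical stand-ins for the classifier's verdict, not the researchers' actual system.

```go
package main

import (
	"fmt"
	"sort"
)

// Answer is a hypothetical representation of a Stack Overflow answer:
// its community score, plus the deep-learning classifier's verdict on
// whether the embedded code example is insecure.
type Answer struct {
	ID       int
	Score    int
	Insecure bool
}

// rerank sorts answers so secure ones appear first, with each group
// ordered by descending score - the nudge leaves all answers visible,
// it only changes what the developer sees first.
func rerank(answers []Answer) {
	sort.SliceStable(answers, func(i, j int) bool {
		if answers[i].Insecure != answers[j].Insecure {
			return !answers[i].Insecure // secure before insecure
		}
		return answers[i].Score > answers[j].Score
	})
}

func main() {
	answers := []Answer{
		{ID: 1, Score: 120, Insecure: true},
		{ID: 2, Score: 40, Insecure: false},
		{ID: 3, Score: 75, Insecure: false},
	}
	rerank(answers)
	for _, a := range answers {
		fmt.Println(a.ID, a.Score, a.Insecure)
	}
}
```

Because nothing is hidden or forbidden, this is a nudge rather than a restriction: a popular-but-insecure answer is still reachable, just no longer the default choice.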
"Our nudging interventions did not harm productivity and significantly increased code security," he said.
In short, third-party code, third-party vendor relationships, and third-party advice have the potential to be harmful, but they don't have to be that way. ®