A massive data breach in the US state of Utah in March was caused by configuration errors, officials have revealed.
The attack was made possible by a series of configuration mistakes during an upgrade that left the server wide open to attackers, who downloaded data from the server on 30 March, according to the interim director of Utah’s Department of Technology Services (DTS), Mark Van Orden.
Utah’s former DTS director resigned last week in the wake of the data breach, which exposed the personal information of nearly 800,000 people to hackers believed to have been in Eastern Europe.
The server, he explained, was installed by an independent contractor and was not protected by a firewall during the upgrade. In addition, the server used factory-issued default passwords, which he said is not “routine”.
“Two, three or four mistakes were made,” Van Orden was quoted as saying. “Ninety-nine percent of the state’s data is behind two firewalls, this information was not. It was not encrypted and it did not have hardened passwords.”
Scott Crawford, research director at Enterprise Management Associates, said that organisations seem to struggle with defining security management objectives (such as change control policies for high-value assets), actually implementing those objectives in practice, monitoring the environment for compliance, detecting deviations, and responding effectively when something unusual occurs.
“Lack of identifying high-value assets and prioritising monitoring and control in those environments often contributes to exposures,” he said. “Finer control over access privileges – implicated directly in the Utah case – is one example where such control can and should be scrutinised more carefully and more consistently enforced.”
Poor control over browsers is another example, he added, noting that many of today’s browsers enable sandboxing, validate code and provide other techniques to limit exposure.
“Many organisations find it difficult to keep such issues current, particularly with large numbers of widely distributed endpoints,” he said.
Oftentimes, people are more interested in making things work than making them work right, said Andrew Storms, director of security operations at nCircle.
“One of the most common configuration errors I see is running services with too many permissions,” he said. “For example, in UNIX, the Apache process is run as user www to limit exposure. If Apache were compromised and the process was running as an admin, then the attacker would gain full administrative access to the server.”
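The least-privilege pattern Storms describes can be sketched in Python. This is an illustrative helper, not code from the article: the `www` account name and the order of the calls follow common UNIX daemon practice, where a service that must start as root (for example, to bind port 80) drops to an unprivileged user before handling requests.

```python
import os
import pwd


def effective_user_is_root():
    """Return True if the process currently has root privileges (POSIX)."""
    return os.geteuid() == 0


def drop_privileges(username="www"):
    """Permanently drop root privileges to the given unprivileged user.

    'www' is an illustrative account name; real deployments vary.
    Does nothing if the process is already unprivileged.
    """
    if not effective_user_is_root():
        return
    target = pwd.getpwnam(username)
    # Drop the group first: once the uid changes, setgid is no longer allowed.
    os.setgid(target.pw_gid)
    os.setuid(target.pw_uid)
```

A compromise of a process running this way yields only the `www` user's access, not full administrative control of the server.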
“Another common configuration error is altering file system permissions in order to make an application run,” he continued. “This is the quick and easy way out of file/folder access problems, but it’s better to ask why the application needs access to those files in the first place.”
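The "quick and easy way out" Storms warns about usually means loosening file modes (e.g. `chmod 666` or `777`) until the application stops complaining. A minimal check for that kind of blanket permission change might look like this hypothetical helper:

```python
import os
import stat


def world_writable(path):
    """Return True if `path` grants write access to every user on the system,
    the kind of blanket permission loosening used to paper over access errors."""
    mode = os.stat(path).st_mode
    return bool(mode & stat.S_IWOTH)
```

Flagging world-writable files in an application's directories is a cheap way to catch this shortcut before asking the better question: why does the application need access to those files at all?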
“Even if [people] understand the implications of configuration errors, they take the easy way out,” he said.