.conf 2013 Keynote: Splunk Aims To Take The Analytics Market
The launch of Splunk Enterprise 6, a free licence for Splunk Storm and the new Splunk Cloud were just some of the announcements made at the annual user conference
Splunk is rapidly developing into a leading analytics platform, with a new version of its core software, a fully featured cloud service and native Hadoop integration announced at .conf 2013, the fourth annual Splunk worldwide users’ conference in Las Vegas.
The event was attended by almost 2,000 IT professionals – about twice as many as last year’s .conf.
“Our goal is to innovate across all of our products so we can keep up with all the imaginative and pretty crazy use cases that you come up with,” said Godfrey Sullivan, CEO of Splunk.
The new Splunk Enterprise 6 software is available to download immediately, with a free licence capped at 500MB of data per day.
New toys
For most of the keynote, Sullivan left the talking to his lieutenants. Guido Schroeder, SVP of Products, took to the stage to talk about Splunk Enterprise 6, the latest version of the platform, launched under the tagline “operational intelligence for everyone”. It includes more than 2,000 new features that enable faster analytics via an improved user interface. Chief among the additions are new data models and the pivot tool.
“Marketing is making me call this ‘pivot’, but internally we call it ‘pivizzle’. It’s a brand new way to look at your data,” said Divanny Lamas, product manager at Splunk. “With data models and pivot you can take the effort of searching, do it once, and then save that effort later.”
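To make that “define once, reuse later” idea concrete, here is a minimal sketch of how a data model can be queried programmatically with Splunk’s Python SDK; the pivot interface builds comparable searches on top of the same models. The host, credentials and the “Web” data model and object names are illustrative assumptions, not anything shown at the keynote.

```python
# Minimal sketch: reusing a saved data model from code.
# Requires the Splunk Python SDK: pip install splunk-sdk
# Host, credentials and the "Web" data model/object names are assumptions.
import splunklib.client as client
import splunklib.results as results

service = client.connect(
    host="localhost", port=8089,            # management port of a test instance
    username="admin", password="changeme",  # placeholder credentials
)

# The | datamodel command expands a saved data model object into a search,
# so the modelling effort is defined once and reused by every report built on it.
stream = service.jobs.oneshot("| datamodel Web Web search | head 5")

for item in results.ResultsReader(stream):
    if isinstance(item, dict):  # skip informational messages
        print(item)
```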
According to Schroeder, the new interface allows users to be a lot more productive, with an all-new home experience, reimagined search and integrated maps. Splunk Enterprise 6 also includes centralised, simplified cluster management for Splunk deployments at scale.
Meanwhile, the new High Performance Analytics Store allows the platform to deliver results up to 1,000 times faster than the previous versions of Splunk Enterprise.
“With this technology, we made reports and dashboards that took 20 minutes to run, run in under 10 seconds,” said Stephen Sorkin, Splunk’s CSO, who had worked on the feature for the past two years. “It lets you take a dataset and make any report on that dataset faster. There’s no pre-computed aggregates, no data cubes, so the type of reporting that you can do is not limited in any way.
“We’ve seen our High Performance Analytics Store run at over a billion rows per second in distributed environments.”
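As a hedged illustration of where those gains come from, here is the same report written against raw events and against accelerated data model summaries, run through Splunk’s REST export endpoint with the requests library. The host, credentials, index, sourcetype and “Web” data model are assumptions for illustration; tstats is the Splunk 6 search command that reads from the accelerated summaries rather than scanning raw events.

```python
# Sketch: one report against raw events, and against accelerated summaries.
# Host, credentials, index, sourcetype and the "Web" data model are assumptions.
import json
import requests

SPLUNK = "https://localhost:8089"   # management port
AUTH = ("admin", "changeme")        # placeholder credentials

# Classic search: scans every raw event at report time.
raw_spl = "search index=web sourcetype=access_combined | stats count by status"

# tstats reads from the pre-built data model summaries instead of raw events,
# which is where the "20 minutes down to under 10 seconds" speed-up comes from.
fast_spl = "| tstats count from datamodel=Web by Web.status"

for spl in (raw_spl, fast_spl):
    resp = requests.post(
        SPLUNK + "/services/search/jobs/export",
        auth=AUTH,
        verify=False,               # self-signed certificate on a test box
        data={"search": spl, "output_mode": "json"},
        stream=True,
    )
    for line in resp.iter_lines():
        if line and b'"result"' in line:
            print(spl, "->", json.loads(line)["result"])
```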
Splunk all the things
Moving with the times, Splunk is making a serious effort to extend its platform to include mobile devices. Shroeder said that the recent acquisition of Greek start-up BugSense, the first since the company went public in 2012, was a key milestone in its strategy.
“Imagine having the Splunk engine at your fingertips, inside your mobile device. We can aggregate data, build really beautiful visualisations, and it’s going to be awesome,” said Jon Vlachogiannis, CTO of BugSense. “We’re going to add capabilities to our platform that are going to transform the mobile industry.”
BugSense will announce its product roadmap and new features in the next few months. And the importance of this acquisition cannot be overestimated: search your smartphone or tablet and chances are that a file called ‘BUGSENSE’ is already tucked away in its root directory.
Last year, Splunk launched its first cloud product, Splunk Storm, a service for analysing and troubleshooting cloud applications. Today, Dejan Deklich, VP of Cloud and Server-Side Engineering, announced that Splunk Storm is moving to a free licence, complete with 20GB of storage. He also introduced the new Splunk Cloud – an enterprise-grade offering ready for production use that includes all the features of the Splunk Enterprise platform, currently hosted on Amazon’s servers.
To close the keynote, Clint Sharp, director of product management for big data and operational intelligence, took to the stage to talk about Hunk, Splunk’s analytics offering for Hadoop, which was unveiled in June and is currently still in beta.
“This is unique in the marketplace. Generally, if I want to gain insight into my data, it requires a complicated set of integrations between multiple vendors’ toolsets. This is hard – you have to ETL your data, you have to get it into the right format. Hunk takes your raw data, and gives you analytics. It allows you to explore, visualise and analyse your data at rest in Hadoop,” explained Sharp.
“It works on the data you’ve already put in Hadoop, in whatever format you’ve put it there. And it’s intended for the largest-scale analytics cases you can throw at it: terabytes or petabytes of data. We’re democratising the data that you put in HDFS – it’s your data, you should have access to it.”
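Hunk exposes data in HDFS through what Splunk calls virtual indexes, which can then be searched with the same query language as native indexes. Below is a minimal sketch of such a search over the same REST export endpoint; the virtual index name hadoop_weblogs, the status field and the connection details are illustrative assumptions, and the virtual index itself would need to be configured in Hunk beforehand.

```python
# Sketch: searching data at rest in Hadoop through a Hunk virtual index.
# The virtual index name "hadoop_weblogs", the "status" field and the
# connection details are illustrative assumptions; the index must already
# be mapped to an HDFS location in Hunk's configuration.
import json
import requests

SPLUNK = "https://localhost:8089"
AUTH = ("admin", "changeme")

# No ETL step: the search runs against the data in whatever format it was stored.
spl = "search index=hadoop_weblogs | stats count by status"

resp = requests.post(
    SPLUNK + "/services/search/jobs/export",
    auth=AUTH,
    verify=False,
    data={"search": spl, "output_mode": "json"},
    stream=True,
)

for line in resp.iter_lines():
    if line and b'"result"' in line:
        print(json.loads(line)["result"])
```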