Security, Metrics and Big Data

BBC (UK) and Thoughtworks seem to be ahead of the technology curve, with excellent, thought-provoking sessions on Big Data held at BBC Media City (Salford) and in London.

Big Data is here to stay, and so are the new ways of looking at it. This means the RDBMS could be on its way out, or so it seems. Welcome NoSQL: MongoDB, Cassandra, Riak, Vertica and a host of other databases with the ability to handle hundreds of millions of rows.

Consider a large website that has built in-house connectors to Apache to analyse website traffic in real time and uses that data to block suspicious activity on the site. Known as HTTPShark and backed by Vertica, this home-grown open source engine delivers low-latency response times while ingesting large volumes of data.
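The internals of HTTPShark are not described here, but the general idea of streaming Apache access-log analysis can be sketched as follows. This is a hypothetical illustration: the `detect_bursts` function, the sliding-window size and the request threshold are all assumptions, not part of any real product.

```python
import re
from collections import defaultdict, deque

# Matches the client IP at the start of a common-format Apache access-log line.
LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\]')

def detect_bursts(lines, window=60, threshold=100):
    """Flag client IPs that exceed `threshold` requests within a sliding
    window of the last `window` log entries (a crude burst detector)."""
    recent = deque(maxlen=window)      # sliding window of recent client IPs
    counts = defaultdict(int)          # requests per IP inside the window
    flagged = set()
    for line in lines:
        m = LOG_PATTERN.match(line)
        if not m:
            continue                   # skip malformed lines
        ip = m.group(1)
        if len(recent) == recent.maxlen:
            counts[recent[0]] -= 1     # the oldest entry is about to be evicted
        recent.append(ip)
        counts[ip] += 1
        if counts[ip] > threshold:
            flagged.add(ip)            # candidate for blocking
    return flagged
```

In a real deployment the flagged IPs would feed a firewall or web-server deny list rather than a Python set, and the heavy aggregation would sit in a columnar store such as Vertica.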

Would a CISO allow such low-investment innovation, which provides real-time data, helps prevent malicious attacks and protects the reputation of the company, or would the CISO prefer standard off-the-shelf tools that may or may not measure up?

Can a CISO take advantage of all the available information, such as CPU, disk I/O and other log data, correlated, aggregated and reported as a uniform event stream, thereby providing a picture of the current availability of systems? This requires the CISO to have a hacker's (not a cracker's) mindset: the ability to put different streams together and join the dots.
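Correlating separate sources into one uniform event stream can be as simple as a time-ordered merge. The sketch below is an assumption about the shape of the data: it supposes each source yields `(timestamp, source, payload)` tuples already sorted by timestamp, which real collectors would have to guarantee.

```python
import heapq

def unified_stream(*sources):
    """Merge several per-source streams of (timestamp, source, payload)
    tuples into a single stream ordered by timestamp."""
    yield from heapq.merge(*sources, key=lambda event: event[0])

# Illustrative sample data: CPU metrics, disk I/O metrics and an auth log.
cpu  = [(1, "cpu",  {"load": 0.42}), (4, "cpu",  {"load": 0.97})]
disk = [(2, "disk", {"iops": 1800})]
logs = [(3, "auth", {"msg": "failed login for root"})]

merged = list(unified_stream(cpu, disk, logs))
```

With all events on one timeline, a spike in CPU load next to a burst of failed logins becomes visible in a way it never is when each stream lives in its own silo.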

Another emerging paradigm is continuous delivery. As business demands a faster turnaround for deployment, can auditing for such scenarios keep up, or will auditors still demand staging servers and standard review mechanisms? Minute-by-minute code upgrades across continents and polyglot persistence are here to stay, and security and controls will have to be built around them to ensure compliance, confidentiality, availability and integrity.
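One control that suits minute-by-minute deployments is a tamper-evident audit trail, so auditors can verify after the fact that no deployment record was altered or removed. The hash-chaining sketch below is a hypothetical illustration of that idea, not a prescribed mechanism; the record fields are invented for the example.

```python
import hashlib
import json

def append_record(trail, record):
    """Append a deployment record, chaining its hash to the previous entry."""
    prev = trail[-1]["hash"] if trail else "0" * 64
    body = json.dumps(record, sort_keys=True)
    digest = hashlib.sha256((prev + body).encode()).hexdigest()
    trail.append({"record": record, "prev": prev, "hash": digest})
    return trail

def verify(trail):
    """Recompute the chain; any edited or deleted entry breaks it."""
    prev = "0" * 64
    for entry in trail:
        body = json.dumps(entry["record"], sort_keys=True)
        if entry["prev"] != prev:
            return False
        if hashlib.sha256((prev + body).encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True
```

Because each entry's hash covers the previous one, rewriting any single deployment record invalidates every entry after it, which is exactly the property an auditor of continuous delivery needs.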

A quick word of caution: it would be good to check where these technologies sit on the hype curve before committing to them.

Further reading
Develop: north