
BI Badlands – The Road to Nowhere

By Geoff Sheppard, VP Sales EMEA at Actian Corporation.

 

Date: 29 July 2015

Recently, I’ve noticed it’s difficult to have a reasonable conversation with CIOs and CFOs without it turning into either a diatribe about, or a tribute to, big data and how to tackle it. In a rapidly evolving market, any new concept that promises fresh business insight from data is eagerly seized upon by organizations. The same is true of big data: organizations are treating it like a magic potion that can unleash new opportunities and open up new audiences.

There is no doubt in my mind that big data, if handled properly, has the potential to create new business opportunities, especially if the solution provides fast and actionable business intelligence. The crucial question is whether organizations are taking the right route to tackle the growing mass of information and successfully unlock the treasure trove, or whether they will meander for years in a maze of dead-end BI solutions.

In fact, efforts to tackle big data have divided the market into two tiers: the organizations that have the resources to implement a data analytics solution and those that don't. In recent years, we have seen an increasing number of organizations deploying advanced but highly expensive data analytics technologies, such as in-memory analytics and in-database analytics, both of which promise faster and more flexible business intelligence. The premise of these solutions is that performance is bought with hardware: by investing in more disk and memory, organizations can obtain the analytical performance required to deal with large and complex data volumes.

Following this line of thinking, we have seen countless examples of companies investing years and massive amounts of money in “old” database technologies such as Oracle, SQL Server and DB2. When these prove unable to deliver reporting and analytics in acceptable timeframes, companies throw good money after bad: they buy more hardware and in some instances even spend six-figure sums on high-end solutions such as Teradata or Oracle Exadata, convinced that in-memory and in-database analytics will solve their problem. But the opposite is true. Companies are left floundering to find the capital to pay for these expensive solutions, not to mention the additional resources required for implementing, running, cooling and maintaining them. In my opinion, this is an expensive road to nowhere on which many companies and large-scale enterprises continue to travel, and there appears to be no easy way off it.

What about organizations for which most data analytics solutions currently on the market are simply too expensive? Just because the solutions are out of their price bracket, do they deserve to be left behind, denied an opportunity that could not only let them compete with big players with deep pockets, but also give them a competitive advantage that sets them apart and makes powerful gains? This underserved majority deserves better, so it is time to create a diversion and get them off the road to nowhere.

Instead of taking the go-faster-stripes route, I believe the ideal solution for both sets of organizations is to stop doing the same old, same old and think differently about the types of solutions they need. Throwing more hardware (disk, memory) at the problem is not the answer. It’s time to look at solutions such as Actian’s Vectorwise, which we believe to be the fastest and most cost-effective database software in the world: companies get their performance from on-chip vector processing on a single server, instead of relying on disk storage and RAM spread across racks of pizza boxes in a datacentre. Within the last year, Vectorwise has broken every TPC-H benchmark it has entered for performance, price/performance and energy efficiency, making it one of the world's most cost-effective analytical databases.
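The performance argument behind vectorised execution can be illustrated in miniature. The sketch below is an illustration only, not Vectorwise's actual engine: it contrasts row-at-a-time aggregation with a columnar, vectorised aggregation using NumPy, which processes a whole column in one operation and lets the CPU's SIMD units work on many values per instruction.

```python
import math
import numpy as np

# One million rows of a single numeric column, e.g. order values.
values = np.random.default_rng(0).uniform(0, 100, 1_000_000)

def total_row_at_a_time(column):
    """Row-at-a-time style: one interpreted operation per row,
    the way a traditional tuple-at-a-time engine iterates."""
    total = 0.0
    for v in column:
        total += v
    return total

def total_vectorised(column):
    """Columnar, vectorised style: a single operation over the
    whole column, exploiting the CPU's vector instructions."""
    return float(np.sum(column))

# Both compute the same aggregate; the vectorised version is
# typically orders of magnitude faster in this setting.
assert math.isclose(total_row_at_a_time(values),
                    total_vectorised(values), rel_tol=1e-9)
```

The point of the sketch is that the speed-up comes from how the work is expressed, not from adding hardware: the same single machine runs both versions.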

In conclusion, the adage “if you keep on doing the same, you keep on getting the same” is as true as it ever was, and more so when it comes to throwing hardware or expensive solutions at a common problem. Indeed, if companies want to continue on the road to nowhere, there is little anyone can do. The savvy organizations will be those that think differently and allow themselves to be diverted off that road, onto paths where more adept database technology leads them to analytical enlightenment instead. And that is where CIOs and CFOs should be directing their thoughts.
 




Tags: Applications
