BI Badlands – The Road to Nowhere

By Geoff Sheppard, VP Sales EMEA at Actian Corporation.

 

Date: 4 Jun 2012

Recently, I’ve noticed it’s difficult to have a reasonable conversation with CIOs and CFOs without it turning into either a diatribe against or a tribute to big data and how to tackle it. In a rapidly evolving market space, any new concept that promises fresh business insight from data is eagerly seized upon by organizations. The same is true of big data: organizations are treating it like a magic potion that can unleash new opportunities and reach new audiences.

There is no doubt in my mind that big data, if handled properly, has the potential to create new business opportunities, especially if the solution provides fast and actionable business intelligence. The crucial question is whether organizations are taking the right route to tackle the growing mass of information and unlock the treasure trove, without meandering for years in a maze of dead-end BI solutions.

In fact, efforts to tackle big data have created a division in the market space, splitting it into two tiers: the organizations that have the resources to implement a data analytics solution and those that don't. In recent years, we have seen an increasing number of organizations deploying advanced but highly expensive data analytics technologies, such as in-memory analytics and in-database analytics, both of which promise faster and more flexible business intelligence. The premise of these solutions is that performance is gained through more hardware (disk, memory): by investing in it, organizations can buy the analytical performance required to deal with large and complex data volumes.
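To make the in-database idea concrete, here is a minimal sketch of the principle rather than any vendor's product: push the computation to where the data lives instead of hauling the data out to the computation. Python's built-in sqlite3 module serves purely as a stand-in for a real warehouse, and the orders table and its columns are invented for illustration.

import sqlite3

# An in-memory database standing in for a warehouse; sqlite3 is used here
# only because it ships with Python, and the orders table is invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (region TEXT, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [("EMEA", 120.0), ("EMEA", 80.0), ("APAC", 200.0)])

# In-database analytics: the aggregation executes inside the engine, next
# to the data, and only the small result set leaves the database.
for region, total in con.execute(
        "SELECT region, SUM(amount) FROM orders GROUP BY region"):
    print(region, total)

The contrast is a "SELECT *" followed by client-side aggregation, which ships every row out of the database first; that is the pattern that strains as data volumes grow.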

Following this line of thinking, we have seen countless examples of companies investing years and massive amounts of money in “old” database technologies such as Oracle, SQL Server and DB2. When these prove unable to deliver reporting and analytics in acceptable timeframes, companies throw good money after bad: they buy more hardware and, in some instances, spend six-figure sums on high-end solutions such as Teradata or Oracle Exadata, convinced that in-memory and in-database analytics will solve their problem. But the opposite is true, as companies are left floundering to find the capital to pay for these expensive solutions, not to mention the additional resources required to implement, run, cool and maintain them. In my opinion, this is an expensive road to nowhere on which many companies and large-scale enterprises continue to travel, and there appears to be no easy way of getting off it.

What about organizations for which most data analytics solutions currently on the market are simply too expensive? Just because the solutions are out of their price bracket, do they deserve to be left behind, denied an opportunity that could not only let them compete with big players with deep pockets but also give them a competitive advantage to set themselves apart and make powerful gains? This underserved majority deserves better, so it is time to create a diversion and get them off the road to nowhere.

Instead of taking the go-faster stripes route, I believe the ideal solution for both sets of organizations is to stop doing the same old, same old and think differently about the types of solutions they need. Throwing more hardware (disk, memory) at the problem is not the answer. It’s time to look at solutions such as Actian’s Vectorwise, which we believe to be the fastest and most cost-effective database software in the world: companies benefit from on-chip vector processing on a single server instead of relying on disk storage and RAM spread across tons of pizza boxes in a datacentre. Within the last year, Vectorwise has broken every TPC-H benchmark it has entered for performance, price/performance and energy efficiency, making it one of the world’s most cost-effective analytical databases.
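For readers unfamiliar with what “vector processing” buys, here is a minimal sketch of the general technique, vectorized columnar execution, rather than Vectorwise’s actual engine: operating on a whole column in bulk replaces per-row interpretation overhead with tight, cache-friendly loops. The data is synthetic, the sizes arbitrary, and NumPy merely stands in for an engine’s vectorized primitives.

import time
import numpy as np

# A toy "column" of five million sales values, stored contiguously the way
# a columnar engine would hold it. (Synthetic data, size chosen arbitrarily.)
sales = np.random.rand(5_000_000)

# Row-at-a-time style: one value per iteration, as in a classic
# tuple-at-a-time executor. Per-row overhead dominates.
start = time.time()
total = 0.0
for value in sales:
    if value > 0.5:
        total += value
print(f"row-at-a-time: {time.time() - start:.2f}s  total={total:,.0f}")

# Vectorized style: filter and sum the whole column in bulk, so the CPU
# runs tight, cache-friendly loops over the data instead.
start = time.time()
total = sales[sales > 0.5].sum()
print(f"vectorized:    {time.time() - start:.2f}s  total={total:,.0f}")

On typical hardware the bulk version runs orders of magnitude faster on the same single machine, which is exactly the shape of the argument here: making better use of the CPU beats buying more boxes.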

In conclusion, the adage “if you keep on doing the same, you keep on getting the same” is as true as it ever was, and more so when it comes to throwing hardware or expensive solutions at a common problem. Indeed, if companies want to continue on the road to nowhere, there is little that anyone can do. The savvy organizations will be those that think differently and allow themselves to be diverted off that road, onto paths where more adept database technology leads to analytical enlightenment instead. That is where CIOs and CFOs should be directing their thoughts.
 


