Yearly Archives: 2013

Global data centre investment up 22% in a year

Global investment in data centres has grown 22% in the last year, from approximately $86 billion in 2011 to $105 billion in 2012, according to the latest findings from the DatacenterDynamics (DCD) 2012 Global Census.

The largest increase in investment (23%) was in the facilities management (FM) and mechanical and electrical (M&E) sectors, including such areas as uninterruptible power supplies (UPS), cooling equipment, and data centre infrastructure management (DCIM) systems. This was up from $40 billion in 2011 to $49 billion in 2012.

The IT equipment sector – including ‘active’ equipment such as servers, storage, switches and routers – showed slower growth at 17%, from $30 billion in 2011 to $35 billion in 2012.

The focus on the FM and M&E sectors is largely due to a massive increase in global data centre power demand. The report states that power consumption has grown 63% globally to 38 gigawatts (GW) over the last twelve months, with a further 17% forecast for 2013.
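As a quick sanity check, the growth rates and the implied base and forecast figures can be recomputed from the article's rounded inputs (all numbers below are taken from the paragraphs above):

```typescript
// Figures quoted in the article (rounded)
const invest2011 = 86;   // $bn, total data centre investment in 2011
const invest2012 = 105;  // $bn, total in 2012
const power2012 = 38;    // GW, global consumption after 63% growth

// Year-on-year investment growth: (105 - 86) / 86 ≈ 22%
const investGrowth = (invest2012 - invest2011) / invest2011;

// Implied 2011 consumption: 38 GW / 1.63 ≈ 23.3 GW
const power2011 = power2012 / 1.63;

// 2013 forecast at a further 17%: 38 GW * 1.17 ≈ 44.5 GW
const power2013 = power2012 * 1.17;

console.log(investGrowth.toFixed(3), power2011.toFixed(1), power2013.toFixed(1));
```

The recomputed 22.1% matches the headline figure, so the rounded totals are internally consistent.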

This is because the proportion of high-density racks is increasing, so each rack now consumes more power on average. The need for electrical power generation, distribution, UPS, and cooling equipment in data centres has grown as a result of these power increases.

In spite of this increase in power demand, however, concern about power availability and cost – both of which have been constant topics in the media and data centre professional groups in recent years – is actually down on a global basis.

“This is explained in part by the increasing representation amongst the sample of companies in less developed markets where power requirements are smaller and so less constrained than in mature markets,” said Nicola Hayes, managing director of DCD Intelligence.

“Also in part by efficiency and other strategies put in place by data centre companies over the past 12 months to mitigate against increased power costs and to overcome issues to do with availability.”

The news comes as Ofgem warns that Britain risks running out of energy generating capacity in the winter of 2015-16. A report by the UK energy regulator predicts that the amount of spare capacity could fall from 14% to only 4% in three years, leaving Britain relying more on imported gas.

Commenting on the news, Chris Smith, director at data centre management firm on365, said that inefficient data centres are a major contributor to this problem.

“Data centres consume between 2-3% of the UK’s total electricity and about half of this demand actually powers the computational equipment (servers and storage) with the rest being consumed by supporting infrastructure such as lighting and cooling. In addition, traditional data centres typically run only at around 10% efficiency,” he said.
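If roughly half of a facility's draw reaches the computational equipment, the implied power usage effectiveness (PUE, total facility power divided by IT power) is about 2.0. A small sketch of that back-of-the-envelope reasoning, using only the percentages quoted above:

```typescript
// PUE = total facility power / IT equipment power.
// The quote above puts the IT share of demand at roughly half.
const itShare = 0.5;        // fraction of facility power reaching servers/storage
const pue = 1 / itShare;    // implied PUE ≈ 2.0 (1.0 would be a perfect facility)

// Data centres take 2-3% of UK electricity, so the IT load alone
// accounts for roughly 1-1.5% of national demand.
const ukShareLow = 0.02 * itShare;   // ≈ 1%
const ukShareHigh = 0.03 * itShare;  // ≈ 1.5%

console.log(pue, ukShareLow, ukShareHigh);
```

This is only a rough inference from the quoted figures, not a measurement; real PUE varies widely between facilities.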

Technology of the Year 2014: The best hardware, software, and cloud services

A bumper sticker philosopher once said, “The best things in life aren’t things.” We can’t be sure the deep thinker had cloud computing in mind, but the same thought occurred to us as we drew up our list of best products for the annual Technology of the Year Awards. Long ago, you could press your server’s power button with your own hand, and software came in shrink-wrapped boxes. Now it’s all off in some netherworld, somewhere else, and we reach out to it with a URL instead of our fingers.

This isn’t completely true. There are several tangible objects on our list, but they’re mostly hardware that lives in backrooms away from grubby hands. Anyone who buys them immediately hides them away from everyone, so the machines won’t get hurt. The rest of the winners are pieces of software, many of which aren’t even sold as software, per se. They’re packaged as services, which are even more ephemeral and untouchable than the cloud servers they run on.

Let’s get the real things out of the way quickly. The winners list holds solid server-room machines — including the Dell PowerEdge VRTX and the HP 3PAR StoreServ 7400 — that will be great additions to the backrooms of every enterprise that hasn’t cast its worries aside to embrace the great cloud of the Internet. There are also appliances like the Riverbed Steelhead WAN accelerator and the Synology RackStation storage system. If you still believe in being able to touch your data, or at least the box holding it, then these machines will help you feel secure.

There’s also the Raspberry Pi, a little gift to hackers and makers everywhere that is bringing automation and embedded intelligence to places they’ve never been before. The Pi is technically a real thing that you can reach out and touch, but it’s so small that you could sneeze and lose it. It’s also cheap enough that you can afford to lose it — which makes it suitable for all sorts of interesting applications.

Cloud notions

The rest of the list is composed of ideas, notions, and whimsies delivered through the Internet. Some might take issue with calling Apple’s iOS 7 a whimsy. If there’s anything that’s meant to be seen and touched, it’s an operating system like iOS. This is a fair point, but the operating system is software that’s more and more an extension of the cloud. It’s downloaded from the cloud, and it interacts with the cloud. It’s less a thing than a portal to a bigger, ephemeral universe of services — which makes it all the more compelling.

We also included old favorites from Microsoft. Some people may remember Microsoft Office as an item you could touch because it arrived on plastic disks and came with license keys. Those days are fading, and now Office is available as a subscription service from the cloud, where it joins cloud-based versions of Exchange, SharePoint, and Lync. Microsoft Office 365 wraps them all together in a tightly integrated package for businesses of all sizes.

The link between computer and tangible box is growing ever more tenuous. Riverbed’s Granite renders real storage almost as ephemeral as the cloud by “projecting” it over the WAN from the data center to servers in the remote branch office. The storage lives in the data center, where it’s easy to back up and secure, but as far as branch office users are concerned, it’s accessible locally — at LAN speeds.

Two of the winners this year, the Nutanix NX-3000 virtualization appliance and the PernixData FVP “flash hypervisor,” enable simpler slicing and dicing of computing cycles so that one real box can pretend to be many virtual servers. Both products help you squeeze more virtual servers into less space — PernixData by clustering server-side flash to reduce I/O pressure on the SAN, and Nutanix by eliminating the SAN completely.

We also salute the technology that makes it possible to juggle all of these servers. Puppet and Salt are two packages that organize the creation and configuration of dozens, hundreds, or even thousands of servers, be they real or virtual, in the data center or in the cloud. They spin up new machines, install software, apply patches, and shut down the machines that aren’t needed. There are now so many of these “machines” that we require tools like these to keep everything straight.
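The desired-state model these tools share can be caricatured in a few lines: declare what each machine should look like, then repeatedly converge reality toward it, acting only on drift. The sketch below is entirely hypothetical — the `Machine` type and `converge` function are invented for illustration and are not any Puppet or Salt API:

```typescript
// Toy desired-state configuration: declare the target, converge toward it.
type Machine = { packages: Set<string>; running: Set<string> };

interface DesiredState {
  packages: string[];   // software that must be installed
  services: string[];   // services that must be running
}

function converge(machine: Machine, desired: DesiredState): string[] {
  const actions: string[] = [];
  for (const pkg of desired.packages) {
    if (!machine.packages.has(pkg)) {     // idempotent: only act on drift
      machine.packages.add(pkg);
      actions.push(`install ${pkg}`);
    }
  }
  for (const svc of desired.services) {
    if (!machine.running.has(svc)) {
      machine.running.add(svc);
      actions.push(`start ${svc}`);
    }
  }
  return actions;                         // empty when already converged
}

const web: Machine = { packages: new Set(["openssl"]), running: new Set() };
const desired: DesiredState = {
  packages: ["openssl", "nginx"],
  services: ["nginx"],
};
console.log(converge(web, desired));  // first run: acts on the drift
console.log(converge(web, desired));  // second run: nothing left to do
```

Idempotence is the key property: running the same configuration twice changes nothing the second time, which is what lets these tools manage thousands of machines safely.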

App developments

Our list is also filled with services for fashioning your own cloud services. GitHub lets you offload the job of organizing code. Cloud Foundry and CloudBees are two ways to avoid the tedium of standing up servers and configuring software stacks, instead letting you dive straight into the work of developing applications.

Some of the items are even more abstract. They made the list because they’re more than software packages — they’re political causes. Scala, the programming language, earned a spot on the list because it’s both a practical tool used to build sophisticated websites and a standard-bearer for the functional programming movement. As they say at the podium, this award goes to more than Scala itself. It goes to everyone who believes in the idea that programs are better expressed as functions without messy, bug-inducing side effects.
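The contrast the functional movement cares about fits in a few lines. This sketch is in TypeScript rather than Scala, purely to keep the article's examples in one language; the function names are invented:

```typescript
// Side-effecting style: the result depends on hidden mutable state,
// so calling the same function with the same argument gives different answers.
let runningTotal = 0;
function addToTotal(x: number): number {
  runningTotal += x;
  return runningTotal;
}

// Pure style: same inputs always produce the same output, nothing is mutated.
const sum = (xs: number[]): number => xs.reduce((a, b) => a + b, 0);

console.log(addToTotal(5), addToTotal(5));  // two different results
console.log(sum([5, 5]), sum([5, 5]));      // the same result both times
```

The pure version is what the movement means by avoiding "messy, bug-inducing side effects": it can be tested, cached, and run in parallel without surprises.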

The success of this idea may be most evident in the awards given to the many JavaScript tools. While JavaScript is far from a functional language, it is increasingly used in the style the functional disciples favor. Node.js, for instance, is a nimble server framework that insists the application layer be written as functions responding to events. It’s a far cry from what the dogmatic functional programmers want, but its speed and agility are attracting everyone to the bandwagon.
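That event-driven style can be sketched with Node's built-in EventEmitter; the event names and handlers below are invented for illustration, not taken from any real server:

```typescript
import { EventEmitter } from "node:events";

// The application layer is written as functions responding to events,
// rather than threads blocking on I/O.
const server = new EventEmitter();
const log: string[] = [];

server.on("request", (path: string) => {
  log.push(`handled ${path}`);          // runs once per "request" event
});
server.on("error", (msg: string) => {
  log.push(`error: ${msg}`);
});

// Simulate incoming traffic; a real Node server emits these from sockets.
server.emit("request", "/index.html");
server.emit("request", "/api/items");
server.emit("error", "connection reset");

console.log(log);
```

Because each handler is a small function invoked per event, a single thread can interleave thousands of connections — the source of the "speed and agility" noted above.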

Such JavaScript approaches are everywhere, and this is reflected in the winners list. There are the frameworks like Bootstrap, Ember.js, and D3, all designed to make creating Web applications in the browser as good as, if not better than, what we’ve come to expect from native code. All of them rely upon JavaScript’s functional callback paradigm to produce cleaner code.

We couldn’t mention all 35 of our winning things and not-things here, but those we glossed over are no less relevant. Take a tour through our winners slideshow, and we promise you’ll find something useful. It might be something you can touch, or something you have to grasp through a browser. Even if we can’t see or touch many of the things themselves, we know they’re wonderful to use. That’s a definition of “real,” right?

Optimal Solutions Leadership talks about ideal employees


See Clearly Through Big Data

Deriving actionable insights — correlations, trends, outliers, etc. — in real time from terabytes of big data — operational, transactional, structured, unstructured — is vital to success for businesses of all sizes across all industries in today’s increasingly global, hyper-competitive, always-on, Internet-of-everything business world.

Businesses lacking the capability to effectively leverage big data are at a marked disadvantage. According to Gartner, most organizations are ill prepared to address both the technical and management challenges posed by big data, and as a direct result, few will be able to exploit it for competitive advantage. Putting a finer point on this observation, Gartner predicts that through 2015, 85 percent of Fortune 500 organizations will be unable to exploit big data for competitive advantage.

Given that the majority of organizations have invested significantly in core backend systems — ERP, EIM, EPM, CRM, SCM, HCM, etc. — and upstream tasks such as data capture, storage, and cleansing, it is reasonable to conclude that this big data impasse can in large part be attributed to systems integration complexity, an analytics skills gap, and a lack of proficiency with big data visualization tools — three fronts on which SAP has made great strides over the past few years.

IT Predictions 2014 and Beyond

In today’s IT world, innovation moves at breakneck speed. IT pros know only too well that resting on their laurels is a recipe for disaster.

More than any other large software vendor, SAP has been innovating to beat the band. And we’re not talking about minor tweaks to staid products. SAP’s ‘intellectual’ renewal, sparked by HANA but permeating every aspect of the company’s product portfolio, has been in full swing for several years now, and its pace of innovation shows no signs of slowing. From in-memory technology to cloud delivery, advanced analytics/BI, and a tenacious ‘mobility first’ strategy, SAP is leading the way in the rapid evolution of enterprise technology.

From the core to the edge and all points in between, SAP is driving tectonic business transformation and delivering measurable, meaningful real-world results. Perhaps more than ever before, SAP consultants are challenged to keep abreast of the very latest technological developments — and, more critically, to help SAP customers understand what this tsunami of product advances means to them now and down the road. So what will be the top IT trends in 2014 and beyond?

To help SAP consultants stay on top of their game (and earning potential), we’ve pulled together a roundup of the latest pundit predictions:
