10 emerging technologies that will shape IT's future
Which of today's newest shipping technologies will cast the longest shadow over business computing? Here are our best guesses.
Everyone is a trend watcher. But taking a hard look at the technologies that gave life to the latest buzz phrases is the only way to determine which trends will actually weave their way into the fabric of business computing.
Here at InfoWorld, we're every bit as excited about big changes in the direction of enterprise IT, from the consumerization of IT to infrastructure convergence. But what vapor-free technologies have actually emerged to enable these IT strategies to take shape, and more importantly, which will cement these changes in your IT department in the years to come?
Among the technologies shipping but not yet widely adopted, we see the following 10 having the greatest impact over the long haul. Get to know them.
10. HTML5
Naysayers point out that we've been putting tags together to form Web pages since the beginning of the World Wide Web; HTML5 has simply added new ones. But while HTML5 looks similar to old-fashioned HTML, the tasks it accomplishes are dramatically different. New capabilities such as local data storage make it possible to do much more than pour words and images into a rectangle. Plus, the new HTML5 WebSockets spec defines a way to conduct full-duplex communication for event-driven Web apps. And with Adobe's decision to end development of mobile Flash, an entire sector of the Web development industry is suddenly going to retool as it moves from Flash to HTML5. That represents a tectonic shift for Web developers.
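To see what makes WebSockets more than a buzzword, consider the opening handshake defined in RFC 6455: the server proves it speaks the protocol by hashing the client's `Sec-WebSocket-Key` together with a fixed GUID. A minimal sketch of that computation:

```python
import base64
import hashlib

# Fixed GUID defined in RFC 6455 for the opening handshake.
WS_MAGIC = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"

def websocket_accept(client_key: str) -> str:
    """Compute the Sec-WebSocket-Accept header value for a client's
    Sec-WebSocket-Key, as specified by RFC 6455."""
    digest = hashlib.sha1((client_key + WS_MAGIC).encode("ascii")).digest()
    return base64.b64encode(digest).decode("ascii")

# The example key from RFC 6455, section 1.3, yields the documented
# accept value "s3pPLMBiTxaQ9kYGzzhZRbK+xOo=".
accept = websocket_accept("dGhlIHNhbXBsZSBub25jZQ==")
```

Once the handshake completes, the same TCP connection carries framed messages in both directions, which is what makes event-driven Web apps practical without HTTP polling.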
9. Client-side hypervisors
Desktop virtualization has faltered for two reasons: It requires continuous connection between client and server, and a beefy server to run all those desktop VMs. Client hypervisors solve both problems. Install one on an ordinary machine and leverage the processing power of the client. Laptop users can take a "business VM" with them containing the OS, apps, and personal configuration settings. That VM is secure and separate from whatever else may be running on that machine -- including malware accidentally downloaded -- and you get all the virtualization management advantages, including VM snapshots, portability, and easy recovery. Client hypervisors point to a future where we bring our own computers to work and download or sync our business VMs to start the day.
8. Continuous build tools
The more collaboratively minded developers among us like the way continuous build tools such as Jenkins, Hudson, and other "continuous integration" servers help us work together for the betterment of the whole. These tools put all code through a continuous stream of tests, alerting developers to problems with code they checked in some 10 seconds ago and keeping everybody moving toward the same goal. Continuous integration itself isn't new -- slick proprietary tools have been available for some time -- but the emergence of open source options such as Hudson and Jenkins encourages the kind of experimentation and innovation that comes when programmers are given the chance to make their tools better.
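The core loop a CI server runs is simple: build and test each commit as it arrives, and alert the team the moment something breaks. A toy sketch of that loop (the commit strings, test suite, and notify callback here are illustrative stand-ins, not any real Jenkins API):

```python
from typing import Callable, List

def run_ci(commits: List[str],
           test_suite: Callable[[str], bool],
           notify: Callable[[str], None]) -> List[str]:
    """Minimal sketch of a continuous-integration loop: every commit is
    tested as it arrives, and breakage is reported within seconds
    rather than discovered at release time."""
    passing = []
    for commit in commits:
        if test_suite(commit):
            passing.append(commit)
        else:
            notify(f"build broken by {commit}")  # e.g. an email or chat alert
    return passing

# Hypothetical usage: commit "b" fails the suite and triggers an alert.
alerts = []
good = run_ci(["a", "b", "c"],
              test_suite=lambda c: c != "b",
              notify=alerts.append)
```

Real servers add queues, parallel build agents, and history, but the fast feedback loop is the part that changes how teams work.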
7. Trust on a chip
Assuring security at the highest application level requires verification at every layer, including the physical construction of the computing device. Enter trust on a chip. The TPM (Trusted Platform Module) from the TCG (Trusted Computing Group) was the first popularly adopted hardware chip to assure trusted hardware and boot sequences. Last year, Intel combined the TPM chip and a hardware hypervisor layer to protect boot sequences, memory, and other components, and any software vendor can take advantage of it. Hardware trust solutions aren't perfectly secure, as the Princeton memory freeze and electron microscope attacks showed, but they beat software-only protection. The hardware protection schemes will only get better, and soon enough every computing device you use will have a combined hardware/software protection solution running.
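The mechanism behind a trusted boot sequence is the TPM's "extend" operation: each boot stage's measurement is hashed into a platform configuration register (PCR) along with the register's previous value, so the final digest encodes every stage and their order. A sketch of that hash chain (using SHA-256, as in TPM 2.0 hash banks; the stage names are illustrative):

```python
import hashlib

def extend_pcr(pcr: bytes, measurement: bytes) -> bytes:
    """TPM-style 'extend': the new register value is the hash of the
    old value concatenated with the new measurement."""
    return hashlib.sha256(pcr + measurement).digest()

# A measured boot chains the hash of each stage into one register.
pcr = b"\x00" * 32  # PCRs start zeroed at power-on
for stage in [b"firmware", b"bootloader", b"kernel"]:
    pcr = extend_pcr(pcr, hashlib.sha256(stage).digest())

# Changing any stage, or reordering them, produces a different final
# digest, so a verifier comparing against a known-good value detects
# tampering anywhere in the chain.
```

Because extend is one-way, malware that loads late cannot rewrite the register to hide an earlier modification.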
5. Distributed storage tiering
Vastly faster than disk and many times cheaper than DRAM, NAND flash memory is a hot commodity that will be even hotter when storage management software catches up with its potential in the data center. Its combination of high speed and low cost makes it excellent for server-side cache and a natural choice for tier-one SAN storage. With the cost of flash dropping and the capacities of SSDs on the rise, the days of disk drives in servers and SANs appear to be numbered. The best part: Flash storage will enable server-side storage to be managed as an extension of the SAN, storing the most frequently accessed or I/O-intensive data closer to the app. It's like caching, but smarter and more cost-effective.
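The tiering decision itself boils down to tracking which blocks are hot and serving those from flash. A toy sketch of a frequency-based tiering policy (the class and block names are illustrative, not any vendor's API):

```python
from collections import Counter

class TieringCache:
    """Toy sketch of storage tiering: the most frequently accessed
    blocks are promoted to a small, fast flash tier while everything
    else stays on disk."""

    def __init__(self, flash_slots: int):
        self.flash_slots = flash_slots
        self.hits = Counter()

    def access(self, block: str) -> str:
        self.hits[block] += 1
        # The hottest blocks are served from flash; the rest from disk.
        hot = {b for b, _ in self.hits.most_common(self.flash_slots)}
        return "flash" if block in hot else "disk"

cache = TieringCache(flash_slots=1)
cache.access("a")
cache.access("a")
tier_b = cache.access("b")  # "b" has 1 hit vs. "a"'s 2, so it stays on disk
```

Real tiering software adds aging, write-back policies, and coordination with the SAN, but the hot/cold split is the core idea.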
4. Apache Hadoop
Hadoop breaks new ground by enabling businesses to deploy clusters of commodity servers to crunch through many terabytes of unstructured data -- simply to discover interesting patterns to explore, rather than to start with formal business intelligence objectives. Tools like Apache Hive and Apache Pig have made exploiting Hadoop easier for developers, and the evolution of the Hadoop ecosystem points to further ease and deeper insights in the years to come. As Hadoop solutions proliferate, businesses will better be able to predict the behavior of Web customers, optimize workflows, and discover patterns in everything from medical histories to common search terms. The best thing about the new wave of Hadoop analytics is that we're only beginning to discover where it may lead.
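Hadoop's programming model is MapReduce: a mapper emits key-value pairs, the framework shuffles them by key across the cluster, and a reducer aggregates each group. The classic word count, with the shuffle simulated in-process (on a real cluster these functions would run in parallel across many commodity servers):

```python
from collections import defaultdict
from typing import Dict, Iterable, Iterator, Tuple

def mapper(line: str) -> Iterator[Tuple[str, int]]:
    # Emit (word, 1) for every word in the input line.
    for word in line.lower().split():
        yield word, 1

def reducer(key: str, values: Iterable[int]) -> Tuple[str, int]:
    # Sum the counts the framework has grouped under each key.
    return key, sum(values)

def run_job(lines: Iterable[str]) -> Dict[str, int]:
    """Simulate MapReduce's shuffle phase locally: group mapper output
    by key, then reduce each group."""
    groups = defaultdict(list)
    for line in lines:
        for key, value in mapper(line):
            groups[key].append(value)
    return dict(reducer(k, v) for k, v in groups.items())

counts = run_job(["big data big insight", "big cluster"])
```

Tools like Hive and Pig generate jobs of this shape from SQL-like or dataflow scripts, which is what makes Hadoop approachable for developers who don't want to write mappers by hand.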
3. Advanced synchronization
Apple and Microsoft agree on one thing: It's time to say goodbye to single-user environments, where each user device is a separate island from the rest of the user's computing world. Apple paved the way with iOS and iCloud, introducing a cloud-based syncing service across devices. Microsoft's Windows 8 takes the concept further, keeping not just data but application state in sync. This shift will drastically change how people work on computers, giving applications dramatic new utility. Automatic data syncing coupled with context, such as location, available input methods, connectivity, and sensor-driven data, will give rise to truly user-centric computing, profoundly altering how IT approaches applications, security models, and other tech policies and strategies centered on user productivity.
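One common way such services reconcile the same state edited on two devices is a last-write-wins merge: each key carries a timestamp, and a sync keeps the newer write. A toy sketch (the keys and devices are illustrative, not Apple's or Microsoft's actual protocols):

```python
from typing import Dict, Tuple

# Each device keeps (timestamp, value) per key; a sync merges two
# replicas by keeping the newer write for every key.
State = Dict[str, Tuple[int, str]]

def sync(a: State, b: State) -> State:
    merged: State = {}
    for key in a.keys() | b.keys():
        candidates = [s[key] for s in (a, b) if key in s]
        merged[key] = max(candidates)  # highest timestamp wins
    return merged

phone = {"cursor_pos": (5, "page 12"), "theme": (1, "dark")}
laptop = {"cursor_pos": (3, "page 9"), "volume": (2, "40%")}
state = sync(phone, laptop)
```

Last-write-wins can silently drop concurrent edits, which is why production sync services layer conflict detection or merge rules on top; but it conveys why syncing application state, not just files, is tractable.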
2. Software-defined networks
Data center networks have grown calcified over time. While servers and storage have benefited from software abstractions, networks have remained hardware-bound and static, making them a major roadblock to cloud computing. Enter SDN (software-defined networking), which drapes a software layer over switch and router hardware to serve as both a centrally managed control plane and a platform for innovation. SDN isn't network virtualization; rather, it's a way to "program the network" -- that is, it allows cloud providers and ISVs to build new networking capabilities the rest of us can draw on. The leading example of SDN today is OpenFlow, the brainchild of university researchers who wanted to experiment with new network protocols on large production networks.
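The abstraction OpenFlow standardizes is the flow table: an ordered list of match-action rules that a central controller installs in each switch. A toy sketch of how a switch consults such a table (the field names and actions are simplified illustrations, not the OpenFlow wire format):

```python
from typing import Dict, List, Tuple

# A flow table is an ordered list of (match, action) rules installed by
# a central controller; the switch applies the first rule that matches.
FlowTable = List[Tuple[Dict[str, str], str]]

def forward(table: FlowTable, packet: Dict[str, str]) -> str:
    for match, action in table:
        if all(packet.get(field) == value for field, value in match.items()):
            return action
    return "send_to_controller"  # table miss: ask the controller for a rule

table = [
    ({"dst_ip": "10.0.0.2"}, "output:port2"),
    ({"dst_ip": "10.0.0.3", "tcp_port": "80"}, "drop"),
]
action = forward(table, {"dst_ip": "10.0.0.2", "tcp_port": "22"})
```

Because the controller, not the switch firmware, decides what rules exist, new forwarding behavior becomes a software deployment rather than a hardware upgrade.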
1. Private cloud orchestration
With a private cloud, IT managers can borrow technologies pioneered by public cloud providers and apply them to their own data center. These clouds have many moving parts -- virtualization management, chargeback systems, self-service provisioning -- hence the need for orchestration. Open source project OpenStack has gained considerable momentum offering a core set of cloud orchestration services. Eucalyptus is another alternative, offering essentially a private cloud implementation of Amazon Web Services. It may be easy to be cynical about any technology attached to the term "cloud," but no one questions the benefits of pooling resources for greater economies of scale. Paradigm changes demand new ways of working -- and the emerging collection of cloud orchestration software supplies the means.
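Those moving parts fit together in a simple pattern: a shared capacity pool, self-service requests against it, and metering for chargeback. A toy sketch of that pattern (the class, rates, and tenant names are illustrative; real orchestrators such as OpenStack coordinate this across many hypervisors):

```python
from typing import Dict

class PrivateCloud:
    """Toy sketch of cloud orchestration: pooled capacity,
    self-service provisioning, and per-tenant chargeback."""

    def __init__(self, total_vcpus: int, rate_per_vcpu: float):
        self.free = total_vcpus
        self.rate = rate_per_vcpu
        self.usage: Dict[str, int] = {}

    def provision(self, tenant: str, vcpus: int) -> bool:
        # Self-service: tenants request capacity without a ticket queue.
        if vcpus > self.free:
            return False  # pool exhausted; a real system might queue or burst
        self.free -= vcpus
        self.usage[tenant] = self.usage.get(tenant, 0) + vcpus
        return True

    def chargeback(self, tenant: str) -> float:
        # Bill each department for what it actually consumes.
        return self.usage.get(tenant, 0) * self.rate

cloud = PrivateCloud(total_vcpus=16, rate_per_vcpu=0.05)
cloud.provision("marketing", 4)
cloud.provision("engineering", 8)
bill = cloud.chargeback("engineering")
```

The economies of scale come from the shared pool: idle capacity freed by one department is immediately available to another.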