Over at TechCrunch, Josh Constine reports that @WalmartLabs has acquired OneOps to expand its Infrastructure-as-a-Service (IaaS) offerings. In a related move, the retail giant also acquired social software developer Tasty Labs.
OneOps developed a Platform-as-a-Service (PaaS) capability that Walmart says will enable it to “significantly accelerate” its PaaS and private cloud Infrastructure-as-a-Service (IaaS) strategies. The company offered developer tools built from the ground up for those who host their applications on cloud services such as Amazon Web Services, Rackspace, and HP Cloud. Developers could publish to any cloud and seamlessly port their apps elsewhere as needed, eliminating lock-in.
At InformationWeek, Charles Babcock reports that Google is no longer employing its custom version of Linux, opting instead for the open-source Debian distribution.
In moving to Debian, Google is demonstrating that it wants Google Compute Engine to become less Google-technology specific and more of a standard platform. Compute Engine’s predecessor, App Engine, a developer’s platform as a service, restricted itself to Google’s favorite language, Python, at its launch. Basing Compute Engine workloads on Debian means the favored operating system will be supported by a community larger than Google’s own development team.
Over at Wired, Vish Ganapathy reports that big-box retailers are using Big Data analytics hosted in the cloud to learn more about their customers and to compete with the e-commerce segment.
Cloud computing involves a new way of thinking about data. In a cloud, a single server can host many virtual servers, slashing hardware costs. The virtual servers can scale on demand depending on the need for compute capacity. That’s very useful for retailers, whose businesses are notoriously seasonal. Automatically expanding capacity on Black Friday, for example, can reduce lines at checkout counters and ensure quick service.
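The on-demand scaling described above boils down to a simple capacity rule. Here is a minimal sketch in Python, using made-up numbers and a hypothetical `servers_needed` helper rather than any specific cloud provider’s autoscaling API:

```python
import math

def servers_needed(requests_per_sec, capacity_per_server=500, min_servers=2):
    # Hypothetical autoscaling rule: provision enough virtual servers to
    # cover the current request rate, never dropping below a fixed floor.
    return max(min_servers, math.ceil(requests_per_sec / capacity_per_server))

print(servers_needed(1_200))   # ordinary weekday traffic -> 3
print(servers_needed(48_000))  # Black Friday surge       -> 96
```

A real autoscaler would also smooth the input signal and scale down gradually, but the core idea is the same: capacity tracks demand instead of being sized for the seasonal peak.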
Over at Power Engineering, LS Subramanian writes that Big Data and the use of cloud technology will be vitally important in meeting the world’s energy needs down the road.
As the cost of energy increases and its availability decreases, the industry is making extensive use of data collected during the discovery, extraction, processing, transmission, and distribution of energy. The energy business is increasingly using Big Data and cloud computing to deliver efficient and cost-effective solutions.
Over at Talkin’ Cloud, Chris Talbot writes that IBM’s acquisition of UrbanCode will help Big Blue increase its presence in cloud, Big Data, and other areas.
UrbanCode’s technology promises to let organizations reduce the cycle time for deploying updates or new applications from days or months to minutes, while keeping risks to a minimum and reducing costs. In the end, the goal is to provide end users with an overall improvement in the quality of applications and services.
Over at GigaOM, Derrick Harris writes that cloud provider Joyent has a new Hadoop offering that the company claims can outperform most others on the market.
Joyent’s Hadoop service is based on the Hortonworks Data Platform (as are the Microsoft and Rackspace offerings) and — according to Joyent — runs three times faster than some other cloud-based Hadoop services. This is so, according to Joyent CTO Jason Hoffman, in part because Joyent’s cloud architecture is highly persistent (i.e., storage and compute are co-located and non-ephemeral), which means Joyent can bring the MapReduce processors to the data. In other cloud environments, data — potentially lots of it in the case of Hadoop jobs — might have to traverse a network in order to reach the Hadoop processors, suffering variable network performance and bottlenecks along the way.
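The MapReduce model at the heart of Hadoop can be illustrated with a toy word-count sketch in plain Python. This is a stand-in for a real Hadoop job, not Joyent’s or Hortonworks’ API; in an actual cluster, each map task would run on the node where its data block already lives, which is exactly the locality advantage Joyent is claiming:

```python
from collections import defaultdict

def map_phase(document):
    # Map: emit a (word, 1) pair for every word in this input split.
    for word in document.split():
        yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle/reduce: group the pairs by word and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["the cloud scales", "the cloud persists"]
pairs = [p for d in docs for p in map_phase(d)]
print(reduce_phase(pairs))
# -> {'the': 2, 'cloud': 2, 'scales': 1, 'persists': 1}
```

In a distributed setting, the expensive step is moving the intermediate pairs across the network between map and reduce; co-locating storage and compute, as Joyent describes, shortens that path.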
In this video, Andrew McLaughlin of Betaworks presents: The Fight for the Future: The Internet, Censorship, Surveillance, and You.
How are Moore’s Law, ever-cheaper computing, and interconnectedness affecting our world? Activists, individuals, and governments are using digital technologies like social media as powerful forces for change.
From 2009 to 2011, Andrew McLaughlin was a member of President Obama’s senior White House staff, serving as Deputy Chief Technology Officer of the United States. In that role, Andrew was responsible for advising the President on Internet, technology, and innovation policy, including open government, cybersecurity, online privacy and free speech, spectrum policy, federal R&D priorities, entrepreneurship, and the creation of open technology standards and platforms for health care, energy efficiency, and education.