In an effort to better support organizations trying to get a grip on Big Data, Oracle is releasing new versions of several hardware and software products.
These upgraded offerings include the Oracle Big Data Appliance X3-2 and Oracle Big Data Connectors. The Big Data Appliance X3-2 hardware/software package pairs eight-core Intel Xeon E5-2600 processors with 18 compute and storage servers, delivering 648 TB of raw storage, 288 CPU cores and 1.1 TB of main memory. It also includes the latest release of Cloudera’s Distribution Including Apache Hadoop (CDH) and Cloudera Manager, as well as the new Oracle Enterprise Manager plug-in for Big Data Appliance. In addition, Big Data Appliance X3-2 ships with updated distributions of Oracle Linux, the Oracle Java Development Kit and open source R.
Meanwhile, the latest iteration of the Oracle Big Data Connectors software suite, which integrates Apache Hadoop with Oracle Database, Data Integrator and R Distribution, includes new features designed to extend its data integration capabilities, such as enhanced automation and Hive table querying for the Oracle SQL Connector for Hadoop Distributed File System, as well as increased access to Hadoop from the R environment.
And version 2.0 of Oracle NoSQL Database, which is included as part of Big Data Appliance X3-2, now stores and retrieves large objects and automatically re-allocates storage resources as production data processing requirements change. Integration with both the Oracle database and Hadoop environments is also tightened.
Big Data Becomes Value Add
Oracle has beefed up its Big Data Appliance significantly since its initial release in January 2012. Anyone wondering why Oracle is putting so much effort into Big Data storage and processing should consider a presentation given at the recent Gilbane 2012 Conference in Boston. In “Big Data for Enterprise and Marketing Applications — Three Views,” Stefan Andreasen, founder and CTO of Kapow Software, explained why Big Data is becoming such a value-add for enterprises.
According to Andreasen, 90% of all data on the planet has been created in the past two years. Andreasen said the trick to extracting value from Big Data is to focus on data that is relevant and important to you, and is also recent. “If data is too old, it’s irrelevant,” he said.
Sitecore Analytics Director Ron Person described how new volumes of Big Data will require new analytical methods. “Eighty to 90% of Big Data is unstructured,” said Person. “We are at the petabyte level, which is a 1 followed by 15 zeroes. Walmart creates 50 million filing cabinets’ worth of data every hour.”
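To put those prefixes in perspective, here is a minimal sketch of the scale involved, assuming the decimal (SI) convention in which a petabyte is 10^15 bytes:

```python
# Decimal (SI) byte prefixes: each step up is a factor of 1,000.
PREFIXES = {"KB": 10**3, "MB": 10**6, "GB": 10**9, "TB": 10**12, "PB": 10**15}

def to_bytes(value, prefix):
    """Convert a value with an SI prefix (e.g. 648 TB) to raw bytes."""
    return value * PREFIXES[prefix]

# A petabyte is a 1 followed by 15 zeroes:
print(to_bytes(1, "PB"))  # 1000000000000000

# The appliance's 648 TB of raw storage, expressed in petabytes:
print(to_bytes(648, "TB") / PREFIXES["PB"])  # 0.648
```

Under this convention, a full rack's 648 TB of raw storage is roughly two-thirds of a single petabyte, which underlines how large "petabyte level" really is.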
With this startling volume of Big Data and the need for plucking recent, relevant information out of it, Oracle’s decision to focus huge amounts of processing and storage capability on Big Data becomes much easier to understand.
Keeping Up With the Joneses
Commentary in EWeek supports the notion that Oracle’s latest, greatest Big Data effort is basically Oracle's attempt to keep up with enterprise technology rivals, such as Microsoft and HP, that are busy trying to prove to the market they offer the best Big Data value-add. “Naturally, the company claims that these latest versions provide more processing power, memory capacity, enhanced integration and management capabilities than previous products,” states an EWeek article on the new Oracle Big Data releases. “Oracle — nor any other Silicon Valley IT company — is certainly not going to claim anything less. Results, of course, will vary within existing systems.”