Google and Microsoft continue to chase each other's tails. Public sector email was in focus yesterday; today it's Google Apps, which has just received a government security certification. Meanwhile, IGC has joined the XML project for electronic discovery.
Google Gets US Security Cert
It looks like Google has already made progress on this front: it announced yesterday morning that Google Apps has been certified as meeting the security requirements of the US government.
The certified apps are bundled in a new edition of the Google Apps productivity suite called, appropriately, Google Apps for Government. It contains the core apps, including Gmail, Docs, Video, Calendar and Postini, at the same price as Google Apps: US$ 50 per user per year.
The new certification gives Google Apps a FISMA-Moderate rating, which means it is authorized for use with data that is considered sensitive but not classified, on servers that are isolated from the ones used by private sector clients.
While it gives Google added compliance and risk credibility, it will only intensify the Microsoft vs. Google stand-off: Microsoft is reportedly already seeking FISMA certification for its Business Productivity Online Suite.
While the certification is significant for Google, it still has a long way to go before it can put Microsoft back in its box.
McAfee Upgrades Risk Management Solution
On the risk management front, McAfee has just unveiled a new IT security solution. It combines real-time threat intelligence with global vulnerability scanning across applications, databases and networks, and correlates the results with the security countermeasures already in place.
The result, McAfee (news, site) says, is that it helps organizations assess their highest-priority risks. Consisting of McAfee Risk Advisor 2.5, McAfee Vulnerability Manager 7.0 and McAfee Vulnerability Manager for Databases, it automatically pulls information together into a real-time risk analytics engine.
This enables organizations to understand their risk posture across every meaningful part of their IT environment, including databases, web applications, systems and networks, and to see precisely what is needed to strengthen their security.
All products are available now. If you want to know more, check out the McAfee Web GRC page at the McAfee website.
Cyber Attacks Cost US$ 3.8m Each
If you're trying to get some idea of how much a cyber attack will actually cost you, a new report from the Ponemon Institute (news, site) analyzing cyber attacks against 45 companies puts the average cost to businesses at US$ 3.8 million.
The study covered organizations with 500 or more seats, and was conducted during a five-month period ending June 23. The findings show:
- It took an average of 14 days to resolve a cyber attack, at an average cost of US$ 17,696 a day. Malicious insider attacks can take 42 days or more to resolve.
- Malicious insider attacks cost an average of US$ 100,300 a day.
- The most expensive are Web-based attacks, which cost US$ 143,209 per day.
- On an annualized basis, detection and recovery account for a combined 46 percent of the total internal activity cost, with human resources representing the majority of the price tag.
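As a rough back-of-the-envelope check on the figures above, the per-day and duration averages can be multiplied out. This is an illustrative sketch using only the numbers quoted in the bullet points, not data taken from the report itself:

```python
# Back-of-the-envelope arithmetic on the Ponemon figures quoted above.
# All inputs come from the bullet points; the calculation is illustrative only.

avg_days_to_resolve = 14      # average time to resolve a cyber attack, days
avg_cost_per_day = 17_696     # average cost per day, US$

# Approximate cost of a single "average" attack
per_attack_cost = avg_days_to_resolve * avg_cost_per_day
print(f"Average attack: ${per_attack_cost:,}")   # $247,744

# A malicious insider attack running 42 days at US$ 100,300 a day
insider_cost = 42 * 100_300
print(f"Insider attack: ${insider_cost:,}")      # $4,212,600
```

Which makes the point: a single drawn-out insider incident alone can exceed the US$ 3.8 million annual average.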
The First Annual Cost of Cyber Crime Study was conducted in early 2010 from a survey of 45 U.S. organizations representing a cross section of markets. The study focused on the direct, indirect and opportunity costs that resulted from loss or theft of information, disruption to business operations, revenue loss and destruction of property. Find out more in the full report.
Data Loss and Human Error
Another piece of recent research worth a look, which shows that no amount of software will completely protect your company from data loss, comes from legal software and services provider Kroll Ontrack.
Kroll Ontrack asked more than 2,000 participants from 17 countries across North America, Europe and Asia Pacific to explain the cause of their most recent data loss. Of them:
- 40 percent of respondents believed that human error was a leading cause.
- Only 27 percent said they could actually attribute a recent loss to human error.
- 29 percent identified hardware/system failure as the cause of their most recent data loss, down from 56 percent in a similar survey in 2005.
Interestingly, and possibly a case of heads in the sand, data loss attributed to computer viruses and natural disasters remained fairly low. In the 2010 survey, computer viruses accounted for less than 7 percent of recent data loss incidents, up from 4 percent in 2005. Similarly, natural disasters were responsible for 3 percent of incidents in 2010, versus 2 percent in 2005.
Is this actually the case, or is it an awareness issue? Either way, if you want to read more, check out the survey.
IGC Joins EDRM
Finally, IGC (news, site), which provides collaboration and redaction software, has just announced that it has joined the Extensible Markup Language (XML) Project of the Electronic Discovery Reference Model (EDRM).
EDRM is an industry group created to establish practical guidelines and standards for electronic discovery (e-discovery).
The goal of the EDRM XML Project is to provide a standard, generally accepted XML schema to facilitate the movement of electronically stored information (ESI) from one step of the electronic discovery process to the next, from one software program to the next, and from one organization to the next.
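To make the idea concrete, the sketch below shows what a tool-to-tool ESI interchange document might look like in principle: each item of electronically stored information is wrapped in XML along with identifying metadata so the next tool in the chain can consume it. The element and attribute names here are hypothetical illustrations, not the actual EDRM XML schema:

```python
# Illustrative sketch of an ESI interchange record. Element and attribute
# names ("Batch", "Document", "Custodian", etc.) are invented for this
# example and do NOT reflect the real EDRM XML schema.
import xml.etree.ElementTree as ET

root = ET.Element("Batch", attrib={"producer": "Tool A"})
doc = ET.SubElement(root, "Document", attrib={"id": "DOC-0001"})
ET.SubElement(doc, "Custodian").text = "J. Smith"
ET.SubElement(doc, "FileName").text = "contract_draft.docx"
ET.SubElement(doc, "HashMD5").text = "9e107d9d372bb6826bd81d3542a419d6"

# Serialize so another e-discovery tool could ingest the same record
xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

The hash and custodian fields hint at why a shared schema matters: chain-of-custody metadata must survive intact as ESI moves between products.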
The Electronic Discovery Reference Model (EDRM) Project was launched in 2005 to address the lack of standards and guidelines in the electronic discovery market. The completed reference model provides a common, flexible and extensible framework for the development, selection, evaluation and use of electronic discovery products and services. For more information, visit the EDRM website.