Thursday, 20 June 2013

Ten rules for building clouds


1. Aim for 100 percent automated provisioning: Part of the reason for installing a cloud is that you want to speed the provisioning of new compute power. Putting in authorization checkpoints slows down this process. Sure, a cost center owner may need to confirm the cost of the new compute power, but that aside, everything should happen automatically wherever possible.
2. Aim for 100 percent automated testing of new/revised catalog entries: Cloud catalogs contain a list of types of compute power (Linux, Red Hat, Windows) and application add-ons (accounting software, analytics software) that users want. The IT function will have populated that catalog after exhaustive testing. But things change and that catalog will need to be kept up to date using automated testing techniques to handle new releases. That way the testing is consistent and less onerous. This helps to reduce the support costs and protects the enterprise. Automate the deployment of patches and fixes to the deployed systems in the cloud, too.
3. Reuse “Lego-like” building blocks using SOA concepts to build the cloud catalog: If you have more than one catalog entry that requires Windows 7 as the operating system, then try to have only one Windows 7 image in your catalog with constructed workflows that add the applications on top. That way you have a smaller number of components to manage and keep up to date, reducing your costs in the process.
4. Design your cloud to help transform your business: Cloud computing is about reducing costs and making things happen. So instead of waiting weeks – or months – for new compute power to be installed, the wait is minutes or hours. That means users have far more power and control over how the compute they need is accessed. Business users have another tool at their disposal, and therefore the role of IT changes. How this is all implemented takes thought. Without it, cloud is just another IT project with limited value. Form the cloud vision early and manage it.
5. Get cloud governance up and running early: The cloud vision – and the benefits it can realize – need to be owned by the organization. So governance needs to be in place early in the development phase to ensure that the vision is true and achievable, and that changes in requirements or the solution are properly assessed and accepted. When the cloud is live, this governance should ensure that it is managed properly, using measures in the form of Key Performance Indicators (KPIs) and change control to keep the cloud true to the vision.
6. Do not automate manual processes: In the non-cloud world, there will be various processes with manual steps and authorizations required to provide new compute power. All of this takes time and money. In the cloud, none of these real-world constraints exist. So take the time to step back and really work out what is needed from a process point of view. The challenge is to have at most one manual authorization step for provisioning compute power. Make it as fast and as snappy as possible to provide a fantastic and responsive service to business users.
7. Only monitor, report and manage things that matter: Cloud governance processes will manage the cloud for the benefit of the organization. They will need information to do that, matched to the KPIs. But measure only the minimum needed to enable both governance and systems management. Do not put huge amounts of effort into measuring things that have no value in the management of the cloud.
8. The cloud is self-documenting: With physical things in the non-cloud world, documentation and records need to be kept of what is where, as well as what is connected to what. Most cloud management software provides extensive reporting facilities, which the cloud uses to effectively document itself. Therefore, there is little value in duplicating these features and spending lots of effort keeping records outside of the cloud up to date. Let the cloud do it for you and use the power of the built-in features as much as possible.
9. Clouds are used by business users who should be protected from technical detail: Business users are good at running the business and not that knowledgeable about IT. IT people are good at managing IT but not at managing the business. So set the cloud up to use common language rather than jargon. This is so that business users do not need to understand the technical detail of the cloud. This is particularly true of the cloud catalog where the entries for selection by business users need to be readily understandable.
10. Use out-of-the-box features as much as possible: It is tempting to think that the cloud should provide some features you deem more desirable than anything else. But proceed with caution. Any add-ons or changes you make will reduce the ease of updating the cloud software when the vendor releases updates. Similarly, a lot of effort – and expense – will go into adapting the cloud, which delays the return on investment and pushes that point further out. These extras mean retaining (potentially) expensive knowledge in the enterprise, at a cost. So use as many out-of-the-box features as possible and resist the urge to tweak, extend and replace.
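Rules 1 and 6 together amount to a workflow with exactly one optional human checkpoint. As a minimal sketch – all names, thresholds and steps here are hypothetical illustrations, not any real cloud API – the idea looks like this:

```python
# Hypothetical sketch of rules 1 and 6: fully automated provisioning
# with at most one manual step (cost-center confirmation).

AUTO_APPROVE_LIMIT = 500  # monthly cost (USD) below which no human confirms

def provision(request, confirm_cost):
    """Provision a catalog entry, pausing only for cost confirmation."""
    if request["monthly_cost"] > AUTO_APPROVE_LIMIT:
        # The single allowed manual step: the cost-center owner says yes/no.
        if not confirm_cost(request):
            return {"status": "rejected", "reason": "cost not confirmed"}
    # Everything else is automatic: image, add-ons, monitoring hooks.
    steps = ["deploy_image", "install_addons", "register_monitoring"]
    return {"status": "provisioned", "steps_run": steps}

request = {"entry": "linux-web-server", "monthly_cost": 120}
result = provision(request, confirm_cost=lambda r: True)
print(result["status"])  # small requests never reach the manual step
```

The point of the shape, not the names: provisioning only blocks on a human when the cost rule demands it, and everything downstream of that gate runs unattended.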

Tuesday, 18 June 2013

Accel Closes $100M For Big Data Fund 2 To Invest In The ‘Second Wave’ Of Big Data Startups

The tech industry has been buzzing about “big data” for years now. And according to venture capital firm Accel Partners, the excitement around the big data space is not set to die down any time soon — it’s just about to enter into a new phase.
Accel is announcing tonight that it has closed on $100 million for a new investment fund called Big Data Fund 2. The fund is the same size as Accel’s first big data focused fund, which launched with $100 million back in November 2011.
As part of the new fund, Accel is also adding QlikView CTO Anthony Deighton and Imperva CEO Shlomo Kramer to its Big Data Fund Advisory Council, which Accel has said is meant to serve as a “guiding light” to help think through investments and track entrepreneurs doing interesting things in the space.
Despite the nearly identical name, Accel’s Big Data Fund 2 will mark a definite shift in focus from the firm’s first big data fund, partner Jake Flomenberg said in a phone call today. “Over the past few years, we’ve focused a tremendous amount of attention on what people like to call the ‘three Vs’ of big data: variety, volume and velocity,” he said. “We now believe there is a fourth V, which is end user value, and that hasn’t been addressed to the same extent,” and that is where Big Data Fund 2 will be focusing the bulk of its investment and attention.
Specifically, Accel believes that “last mile” for big data will be served largely by startups focused on data-driven software, or “DDS.” These startups have largely been made possible through the hardware and infrastructure technology innovations that defined big data’s first wave, Flomenberg says. In a prepared statement from Accel, Facebook engineering VP Jay Parikh, who also serves on Accel’s Big Data Advisory Council, explained it like this:
“The last mile of big data will be built by a new class of software applications that enable everyday users to get real value out of all the data being created. Today’s entrepreneurs are now able to innovate on top of a technology stack that has grown increasingly powerful in the last few years – enabling product and analytical experiences that are more personalized and more valuable than ever.”
One example Flomenberg pointed to of a “fourth V” DDS startup is RelateIQ, the “next generation relationship manager” software startup which launched out of stealth last week with some $29 million in funding from Accel and others.
Accel’s existing portfolio of big data investments also includes Cloudera, Couchbase, Lookout, Nimble Storage, Opower, Prismatic, QlikView, Sumo Logic, and Trifacta.

Sunday, 9 June 2013

How the feds are using Silicon Valley data scientists to track you

Tech companies like Facebook, Apple, and Google are not the only ones helping U.S. intelligence agencies track citizens.
For years, data scientists have been brought in to brief the National Security Agency.
The NSA has a massive team of analysts and a huge wiretapping program called PRISM, but it is eager to take advantage of the newest “big data” and machine learning technologies, so it can more easily make sense of millions of phone calls, emails, and text messages.
The goal is to track suspicious activity and create a complex “alerts system” for acts of terrorism, said Sean Gourley, a data scientist and founder of Silicon Valley-based Quid, which provides big-data analysis services, mostly for government customers.
Some of the most innovative technology designed to cope with massive data streams has come out of Silicon Valley. For this reason, the CIA’s venture arm, In-Q-Tel, has an office on Palo Alto’s famous Sand Hill Road. It makes strategic investments in “big data” startups, like Recorded Future, whose products may come in useful for various government agencies.
“The NSA is naturally interested in data mining; I know of data scientists in Silicon Valley who have helped them,” said Mike Driscoll, chief executive of Silicon Valley-based big data startup Metamarkets.
“They appeal to our sense of patriotism,” said Driscoll.
Driscoll was not surprised by today’s news exposing the government’s PRISM program, which caused a furor among civil liberties activists and the media. He referred to the Echelon Project, the NSA’s clandestine data mining project and spy program that we’ve known about for years, as a precedent.
To recap: The Washington Post reported today that tech companies are participating in a top secret data mining program for the FBI and NSA, dubbed PRISM. Since the news broke, the companies named in the report have almost universally issued statements to the press that they do not provide direct access to their servers.
However, the government is a third party. Facebook’s terms of service, for instance, state that it can share your information with third parties. The assumption most Facebook users make is that the wording refers to marketers or advertisers, not the government.
“We don’t mind little bits of manipulation, but we do mind if it’s on this scale,” said Gourley.
According to Gourley, who regularly works with federal agencies, the NSA is most interested in real-time systems for data analysis. It’s not just what you say — but who you know. In other words, you’ll be flagged if you’ve communicated with a person of interest, or if you share a suspicious tweet.
“The NSA is essentially looking for a needle in a massive, massive haystack,” he said.
Given that technology exists for sophisticated analysis of social networks, “you could be on the list by association,” he warns.

Thursday, 6 June 2013

VMware unveils vCloud Hybrid Service

VMware has revealed its VMware vCloud Hybrid Service, an infrastructure as a service (IaaS) platform.
“VMware’s mission is to radically simplify IT and help customers transform their IT operations,” said Pat Gelsinger, CEO of VMware.
“Today, with the introduction of the VMware vCloud Hybrid Service, we take a big step forward by coupling all the value of VMware virtualisation and software-defined data centre technologies with the speed and simplicity of a public cloud service that our customers desire.”
vCloud Hybrid Service will extend VMware software, currently being used by hundreds of thousands of customers, into the public cloud. This means customers will be able to utilise the same skills, tools, networking and security models across both on-premise and off-premise environments.
“As a source of competitive advantage for our international business, our operations and IT department needs the agility and efficiency the public cloud promises,” says Julio Sobral, senior VP of business operations at Fox International.
“However, we don’t have the luxury of starting from scratch; we see in the vCloud Hybrid Service a potential solution to enable Fox International to have a more elastic platform that will support future deployments around the world. Working with technology partners like VMware gives us the best of both worlds by extending our existing infrastructure to realise the benefits of public cloud.”
According to the company, the vCloud Hybrid Service will allow customers to extend their data centres to the cloud and will support thousands of applications and more than 90 operating systems that are certified to run on vSphere. This means customers can get the same level of availability and performance running in the public cloud, without changing or rewriting their applications.
Built on vSphere, vCloud Hybrid Service offers automated replication, monitoring and high availability for business-critical applications, leveraging the advanced features of vSphere, including VMware vMotion, High Availability and vSphere Distributed Resource Scheduler.
“Our new VMware vCloud Hybrid Service delivers a public cloud that is completely interoperable with existing VMware virtualised infrastructure,” said Chris Norton, regional director at VMware for southern Africa.
“By taking an ‘inside-out’ approach that will enable new and existing applications to run anywhere, this service will bridge the private and public cloud worlds without compromise.”
According to VMware, the vCloud Hybrid Service will be available this month through an early access programme.

Monday, 3 June 2013

The Real Reason Hadoop Is Such A Big Deal In Big Data

Hadoop is the poster child for Big Data, so much so that the open source data platform has become practically synonymous with the wildly popular term for storing and analyzing huge sets of information.
While Hadoop is not the only Big Data game in town, the software has had a remarkable impact. But exactly why has Hadoop been such a major force in Big Data? What makes this software so damn special - and so important?
Sometimes the reasons behind something's success can be staring you right in the face. For Hadoop, the biggest motivator in the market is simple: Before Hadoop, data storage was expensive.
Hadoop, however, lets you store as much data as you want in whatever form you need, simply by adding more servers to a Hadoop cluster. Each new server (which can be commodity x86 machines with relatively small price tags) adds more storage and more processing power to the overall cluster. This makes data storage with Hadoop far less costly than prior methods of data storage.

Spendy Storage Created The Need For Hadoop

We're not talking about data storage in terms of archiving… that's just putting data onto tape. Companies need to store increasingly large amounts of data and be able to easily get to it for a wide variety of purposes. That kind of data storage was, in the days before Hadoop, pricey.
And, oh what data there is to store. Enterprises and smaller businesses are trying to track a slew of data sets: emails, search results, sales data, inventory data, customer data, click-throughs on websites… all of this and more is coming in faster than ever before, and trying to manage it all in a relational database management system (RDBMS) is a very expensive proposition.
Historically, organizations trying to manage costs would sample that data down to a smaller subset. This down-sampled data would automatically carry certain assumptions, number one being that some data is more important than other data. For example, a company depending on e-commerce data might prioritize its data on the (reasonable) assumption that credit card data is more important than product data, which in turn would be more important than click-through data.
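What that prioritized down-sampling amounts to can be shown in a toy sketch – the record kinds and keep rates below are purely illustrative assumptions, standing in for the "credit card > product > click-through" priority the article describes:

```python
import random

# Illustrative sketch of pre-Hadoop down-sampling: keep everything deemed
# "important", sample the rest. The keep rates ARE the baked-in assumption.
KEEP_RATE = {"credit_card": 1.0, "product": 0.5, "click_through": 0.05}

def down_sample(records, seed=42):
    rng = random.Random(seed)  # seeded so the sketch is reproducible
    return [r for r in records if rng.random() < KEEP_RATE[r["kind"]]]

records = (
    [{"kind": "credit_card"}] * 100
    + [{"kind": "product"}] * 100
    + [{"kind": "click_through"}] * 100
)
kept = down_sample(records)
# Every credit-card row survives; most click-through rows are gone for good,
# so any later question that needed them can no longer be answered.
print(len([r for r in kept if r["kind"] == "credit_card"]))  # 100
```

The discarded rows are exactly the cost the article is describing: once the assumption is encoded and the raw data deleted, no new assumption can be tested.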

Assumptions Can Change

That's fine if your business is based on a single set of assumptions. But what happens if the assumptions change? Any new business scenarios would have to use the down-sampled data still in storage, the data retained based on the original assumptions. The raw data would be long gone, because it was too expensive to keep around. That's why it was down-sampled in the first place.
Expensive RDBMS-based storage also led to data being siloed within an organization. Sales had its data, marketing had its data, accounting had its own data and so on. Worse, each department may have down-sampled its data based on its own assumptions. That can make it very difficult (and misleading) to use the data for company-wide decisions.

Hadoop: Breaking Down The Silos

Hadoop's storage method uses a distributed filesystem that maps data wherever it sits in a cluster on Hadoop servers. The tools to process that data are also distributed, often located on the same servers where the data is housed, which makes for faster data processing.
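The co-location of processing and storage described above is the "data locality" idea. A toy sketch of the scheduling decision – the node and block names are invented for illustration, and real Hadoop schedulers are far more elaborate – might look like this:

```python
# Toy illustration of data-locality scheduling: prefer a node that
# already holds the data block, fall back to any free node otherwise.
block_locations = {"block-1": {"node-a", "node-b"}, "block-2": {"node-c"}}

def schedule(block, free_nodes):
    local = block_locations[block] & free_nodes  # free nodes holding the block
    return (local or free_nodes).pop()  # local wins; otherwise any free node

print(schedule("block-2", {"node-a", "node-c"}))  # node-c: data is local
```

Shipping the computation to the data this way is what avoids dragging terabytes across the network for every job.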
Hadoop, then, allows companies to store data much more cheaply. How much more cheaply? In 2012, Rainstor estimated that running a 75-node, 300TB Hadoop cluster would cost $1.05 million over three years. In 2008, Oracle sold a database with a little over half the storage (168TB) for $2.33 million - and that's not including operating costs. Throw in the salary of an Oracle admin at around $95,000 per year, and you're talking an operational cost of $2.62 million over three years - 2.5 times the cost, for just over half of the storage capacity.
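The arithmetic behind those quoted figures is easy to check:

```python
# Reproducing the cost comparison from the 2012/2008 estimates above.
hadoop_cost = 1.05e6       # 75-node, 300 TB Hadoop cluster over three years
oracle_db = 2.33e6         # 168 TB Oracle database, purchase price
oracle_admin = 95_000 * 3  # admin salary over the same three years
oracle_total = oracle_db + oracle_admin

print(oracle_total)                          # 2615000.0 -> the ~$2.62M quoted
print(round(oracle_total / hadoop_cost, 1))  # 2.5 -> 2.5x the cost
print(round(168 / 300, 2))                   # 0.56 -> just over half the capacity
```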
This kind of price savings means Hadoop lets companies afford to hold all of their data, not just the down-sampled portions. Fixed assumptions don't need to be made in advance. All data becomes equal and equally available, so business scenarios can be run with raw data at any time as needed, without limitation or assumption. This is a very big deal, because if no data needs to be thrown away, any data model a company might want to try becomes fair game.
That scenario is the next step in Hadoop use, explained Doug Cutting, Chief Architect of Cloudera and an early Hadoop pioneer. "Now businesses can add more data sets to their collection," Cutting said. "They can break down the silos in their organization."

More Hadoop Benefits

Hadoop also lets companies store data as it comes in - structured or unstructured - so you don't have to spend money and time configuring data for relational databases and their rigid tables. Since Hadoop can scale so easily, it can also be the perfect platform to catch all the data coming from multiple sources at once.
Hadoop's most touted benefit is its ability to store data much more cheaply than can be done with RDBMS software. But that's only the first part of the story. The capability to catch and hold so much data so cheaply means businesses can use all of their data to make more informed decisions. 

Thursday, 7 March 2013

BRIC nations lag in cloud computing: Study

Brazil, Russia, India and China still lag far behind developed countries in policies considered critical for the future of cloud computing, but each made some progress over the past year, a US industry group said.

The Business Software Alliance, which represents US industry heavyweights such as Microsoft, said the BRIC nations all came in at the bottom half of 24 countries surveyed in its second annual cloud computing report.

Brazil moved from final position to 22nd with a tally of 44.1 out of a possible 100 points.

China, India and Russia each also rose two slots with scores of 51.5, 53.1 and 59.1, respectively.

Cloud computing refers to providing software, storage, computing power and other services to customers from remote data centers over the Web.

Demand for cloud-based software is rising rapidly because the approach allows companies to start using new programs faster and at lower cost than traditional products that are installed at a customer's own data center.

"The cloud is really the hot sector of IT right now," and US companies have a big interest in countries harmonizing policies instead of chopping the cloud into pieces, said Robert Holleyman, president of the Business Software Alliance.

At the same time, the aggregation of massive amounts of data in large data centers "creates new and highly tempting targets" for cyber attacks, making it vital that both law enforcement officials and cloud providers have adequate tools to fight the intrusions, the BSA report said.

"Australia, France, Germany, and Japan score extremely highly in the cybercrime section. Canada, China, (South) Korea, Russia, and Vietnam score poorly. The country that shows the most improvement is Brazil, which finally passed cybercrime laws after a long campaign," the report said.

The 24 countries included in the survey represent 80 per cent of the global information and communications technology industry. They were assessed in seven areas, including data privacy, security, free trade, intellectual property protection, infrastructure and support for industry-led standards to promote smooth data flows.

China got a small boost in this year's rating for introducing new data privacy laws, while Russia got credit for reforms made as a result of its entry into the World Trade Organization. India's improved score reflects changes to its copyright laws to bring them in line with international standards, the report said.

Japan came in first again with 84.1 points. It was followed closely by other developed countries, including Australia, the United States, Germany, Singapore, France, Britain and South Korea, which all scored in the upper 70s.

Singapore jumped to fifth place, from tenth last year, after it passed a new data privacy law praised by BSA for its "light touch" and balanced approach.

"They are really taking on digital trade as another way of putting a stake in the ground and to say they are going to be global hub of business," Holleyman said.

The United States finished second in the survey, up from third in the inaugural report, while Germany, France and Britain each slipped a notch and Italy fell four spots.

Holleyman said the European Union was working on data protection regulations that could potentially make it harder to move data across its borders.

"If that happens I think you can continue to see further sliding by the major European countries," Holleyman said.

Talks on a US-EU free trade agreement are expected to start by June, he said.

Cross-border data flows are already a focus in talks on the Trans-Pacific Partnership (TPP), a proposed regional free trade agreement between the United States and ten other countries in the Asia-Pacific slated for conclusion this year.

One of the TPP countries, Vietnam, finished last in this year's cloud computing scorecard, with a tally of 40.1 points.

Vietnam, Indonesia, China and India have pursued policies that threaten to divide the cloud, either by trying "to wall themselves off or by imposing local requirements that are antithetical to the very underpinning of cloud computing," Holleyman said.


Sunday, 3 March 2013

Govt allows IT SEZs to set up backup centres anywhere in India

The government has allowed IT and ITeS special economic zones to set up disaster recovery centres outside their limits at any part of the country, meeting the long-pending demand of the industry.

Issuing the guidelines for setting up of disaster recovery centres (DRC) and business continuity plan (BCP) for IT/ITeS special economic zones, the Commerce and Industry Ministry said the locations for such facilities will be approved by the respective development commissioner.

"The DRC/BCP location will be approved by the development commissioner (DC) on an application made by the SEZ unit. Such approval will allow the unit to relocate its operations, data and employees to the DRC/BCP location upon the occurrence of a disaster," it said.

However, it said that as this activity is envisaged as a purely internal exercise to be carried out across branches of the same SEZ entity to ensure business continuity, there will be no commercial activity involved and, accordingly, no commercial invoice will be raised in such movement of data, operations and employees.

"It was a long pending demand of the industry. IT/ITeS SEZs need such facilities at the time of any type of disasters. It will certainly help the sector," an official said.

Prevention and data backup are an integral part of the sector's DR/BCP strategy.

"The data are regularly backed up at locations which are isolated from the main business centres to prevent loss in the event of a disaster. This would entail movement of data from SEZ to a DR/BCP location outside the SEZ and movement of storage media back into the zone," it said.

It also said that movement of data from outside the zone would not be treated as exports; in addition, a record of the movement of magnetic storage tapes and devices would be maintained at the tax-free enclave.

However, the unit would have to pay the necessary duty on the tapes and storage devices on which the data is being moved.

It said that the back up location where the "devices are moved could be a location under another SEZ or export oriented units i.e. a bonded secured location".

Further, the guidelines comprehensively define the term 'disaster' and classify it into two categories - natural and man-made. The man-made disasters include hazardous material spills, infrastructure failure or bio-terrorism.

It has also provided norms for setting up of these centres by a third party client.

The move assumes significance as out of over 160 operational SEZs, about half of them relate to IT/ITeS SEZs.


Thursday, 28 February 2013

Mobile industry to employ 10 million globally: Report

The mobile industry will invest $1.1 trillion by 2017, and the ecosystem around it is expected to employ 10 million people globally, said a report released by global industry body the GSM Association.

"For the period through 2017, the mobile industry will contribute $2.6 trillion to public funding. Importantly, in 2017, companies across the ecosystem will employ nearly 10 million people globally," said 'The Mobile Economy 2013' report, developed by the GSMA and consulting major A.T. Kearney.

The report said total mobile ecosystem revenues reached $1.6 trillion -- around 2.2 per cent of global Gross Domestic Product (GDP).

"To fully realise this future and to enable the mobile industry to maximise its investments, it is essential that we establish a light-touch regulatory environment, based predominantly on competition, and develop new business models that will allow all ecosystem participants to benefit from the mobile economy," GSMA Director General Anne Bouverot said.

The report said it expects a further 700 million subscribers will be added by 2017 and the 4 billion-subscriber milestone will be reached in 2018 across the globe.

At the end of 2012, there were 6.8 billion mobile connections worldwide and the study expects it to grow to 9.7 billion by the end of 2017.

High speed internet on mobile phone accounted for 1.6 billion of these connections in 2012, increasing to 5.1 billion in 2017, including 920 million LTE connections, the report said.

Mobile subscriber penetration globally stood at 45 per cent while mobile connection penetration is currently 94 per cent.

As per GSMA Wireless Intelligence, the variance between the number of mobile subscribers and the number of mobile connections is due to multiple-SIM ownership as well as inactive SIMs.
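The two penetration figures above make the multi-SIM point quantifiable: dividing connection penetration by subscriber penetration gives a rough connections-per-subscriber ratio (a back-of-envelope reading of the report's numbers, not a figure the report itself states):

```python
# Rough check of the multi-SIM effect implied by the penetration figures.
subscriber_penetration = 0.45  # 45% of people are mobile subscribers
connection_penetration = 0.94  # 94% penetration counted by connections

# Roughly two connections (SIMs) per subscriber, inactive SIMs included.
print(round(connection_penetration / subscriber_penetration, 1))  # 2.1
```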


Monday, 25 February 2013

HP's webOS operating system to power LG TVs

Hewlett-Packard Co said it will sell the webOS operating system to South Korea's LG Electronics Inc, unloading the smartphone software it acquired through a $1.2 billion acquisition of Palm in 2010.

LG will use the operating software, used in now-defunct Palm smartphones years ago, for its "smart" or Internet-connected TVs. The Asian electronics company had worked with HP on WebOS before offering to buy it outright.

Under the terms of their agreement, LG acquires the operating software's source code, associated documentation, engineering talent, various associated websites, and licenses under HP's intellectual property including patents covering fundamental operating system and user interface technology.

HP will retain the patents and all the technology relating to the cloud service of webOS, HP Chief Operating Officer Bill Veghte said in an interview.

"As we looked at it, we saw a very compelling IP that was very unique in the marketplace," he said, adding that HP has already had a partnership with LG on webOS before the deal was announced.

"As a result of this collaboration, LG offered to acquire the webOS operating system technology," Veghte said.

Skott Ahn, President and CTO, LG Electronics, said the company will incorporate the operating system in the Smart TV line-up first "and then hopefully all the other devices in the future."

Both companies declined to reveal the terms of the deal.

LG will keep the WebOS team in Silicon Valley and, for now, will continue to be based out of HP offices, Ahn said.

HP opened its webOS mobile operating system to developers and companies in 2012 after trying to figure out how to recoup its investment in Palm, one of the pioneers of the smartphone industry.

The company had tried to build products based on webOS, with the now-defunct TouchPad tablet as its flagship product.

HP launched the TouchPad in 2011 and discontinued it a little over a month after it hit store shelves with costly fanfare, after seeing poor demand for a tablet priced on par with Apple's dominant iPad.

webOS is widely viewed as a strong mobile platform, but has been assailed for its paucity of applications, an important consideration when choosing a mobile device.


Indian IT must cross Japan hurdle for next $100 billion: Experts

As the Indian IT industry crosses $100 billion (about 5.4 lakh crore) in revenues and aims for the next $100 billion, it cannot afford to ignore Japan, the world's largest IT market after the US. Most large Indian information technology providers have been present in Japan for close to 20 years but success has been slow in coming.

Of the $125-billion Japanese IT services market, Indian service providers get only $500 million. Embedded services contribute another $500 million, according to technology researcher Gartner. In all, Japan contributes less than 2% of India's software exports.

"If we fix the language and culture issues, growth will happen," said N Chandrasekaran, Nasscom chairman and chief executive of Tata Consultancy Services.

For instance, India's fifth-largest software exporter HCL Technologies trains all its employees working in Japan to speak and understand Japanese.

But even bigger is the cultural barrier. Unlike their western counterparts, companies in Japan do not do big bang outsourcing. They initially look for proof of concept, and if that works and they are comfortable with the vendor, then the relationship progresses to the long term, said Sameer Kishore, corporate vice president who heads the Japan business unit for HCL Technologies.

"From my experience of working in this market for a fair bit of time now, organisations here value relationships," said V Sriram, senior vice president and head of Japan business for Infosys.

Attitudes in Japan are also changing, forced by the rapid pace of technology changes. "The battlefield is shifting to software," said Nobuhiko Hidaka, president of Gartner Japan. Companies, which previously followed a 'rice-farmer culture of doing everything same every year' and were more inward-looking, are now changing as they globalise.

Hidaka said most applications used to be custom-built, but as Japanese companies go global, custom-built applications are being replaced by more standard packages. "Because attitudes are changing and Indian providers are winning in the global IT market, the door is open for the first time for India." In addition, the CEO is getting younger and chief information officers more westernised.

US IT providers, which have a 14% share of the Japanese IT services pie, are moving in to tap this opportunity. Firms like IBM, Accenture and HP-EDS are well-entrenched compared with the Indian providers. However, most of their operations are staffed by Japanese locals and done out of Japan.


Thursday, 21 February 2013

Infosys’ new platform pulls big data 40% faster

Infosys on Wednesday formally launched what it says is one of the most comprehensive solutions in the big data space.

The solution, called BigDataEdge, allows enterprises to easily bring together not just the organized or structured data, but also a vast variety of unstructured data (information contained in, say, emails, document files, contracts with customers or vendors, blogs, social media, call centre voice records, videos). It then enables them to glean insights from all of this data, and take appropriate action.

One major element of the solution is a patent-pending connector framework, which automatically connects to internal and external data sources, including any new source that emerges, and which then makes pulling data together very easy.

"We have been able to reduce the time to discover and aggregate data by up to 40%," says Vishnu Bhat, head of Infosys' cloud and big data business. He says that in the case of a financial service provider, the solution was able to uncover hidden exposures in 43% of their accounts by going through all the written contracts. "Earlier, this would at best be a manual process that took many months. Now you can do it in days or weeks," he says.

The solution can convert voice calls into text to find necessary information. It uses facial recognition and similar technologies to extract information from videos.

Enterprises can then use built-in algorithms (there are some 250 of them) to obtain the insight required from a desired set of data sources, and visualize the insight using some 50 customizable dashboards. "We are able to generate insights eight times faster than is normal for enterprises," Bhat says. The solution also includes a collaboration tool that allows users across functions and regions to interact on the insights and take decisions in real time.

Bhat says the solution can even be used for specific requirements such as fraud detection or predicting network failures with its ability to match current records with historical records. For instance, people have a certain pattern of usage of their bank account. If there is a change in that pattern (because of an online fraudster), the solution quickly recognizes that and can send an alert or temporarily lock the account.
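
The pattern-matching idea behind such alerts can be sketched with a toy example. Everything below (the function, the threshold, the data) is an illustrative assumption, not Infosys' actual implementation: any recent transaction that deviates sharply from a customer's historical usage pattern gets flagged.

```python
from statistics import mean, stdev

def flag_anomalies(history, recent, z_threshold=3.0):
    """Flag recent transactions that deviate sharply from historical usage.

    Toy pattern-based alerting: any recent amount more than z_threshold
    standard deviations from the historical mean is flagged for review.
    """
    mu, sigma = mean(history), stdev(history)
    return [amt for amt in recent if sigma and abs(amt - mu) / sigma > z_threshold]

# A customer who normally spends 40-60 per transaction suddenly shows a 5,000 charge:
history = [45, 52, 48, 60, 41, 55, 50, 47]
alerts = flag_anomalies(history, [49, 5000, 51])  # -> [5000]
```

A real system would of course model far richer features (merchant, geography, timing), but the same compare-against-history principle drives the alert or account lock described above.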

BigDataEdge is the latest in Infosys' Edge series of platforms that also includes, among others, WalletEdge, the mobile payments platform, and BrandEdge, which addresses marketing needs.


Wednesday, 20 February 2013

IBM's New PureSystems Promise to Ease Big Data, Cloud Adoption

Business IT solution development generally follows specific trends and overlapping eras. The first, calculation, occurred as digital products and services wholly replaced the adding machines and other mechanical business devices. The vendors that produced them -- including Burroughs, Sperry and IBM -- became some of early IT's biggest players.

The second, computation, mirrored the evolution of digitized solutions for increasingly complex processes and applications.

The third, consolidation, found vendors creating -- and their customers buying -- products and services that extended IT into virtually every business process and strategy.

The fourth and most recent era, comprehension, began a decade or so ago, when IT vendors began developing solutions to help customers more effectively manage sprawling on- and off-site IT infrastructures, in order to analyze increasingly large and complex volumes of information, as well as to leverage IT to improve decision making.

Confluence of Complementary Technologies

IBM's Smarter Analytics and SmartCloud solutions were among the first commercial offerings in this area. With follow-on developments -- from the Watson system that participated on the Jeopardy! game show to the PureSystems and PureData solutions introduced last year -- IBM has remained at or near the head of the pack of vendors that aim to help their clients successfully adopt and gain the maximum value from their Big Data and cloud computing investments.

This is the context for the new PureSystems and PureData solutions IBM introduced last week, including:

* IBM PureData System for Analytics, powered by Netezza (N2001): Featuring 50 percent greater data capacity per rack, it is able to crunch data three times faster than the previous (N1001) version of the platform.

* A smaller PureApplication System: IBM's new "mini" model offers a "cloud in a box" solution to organizations with limited budgets and resources, and it should also open new opportunities for IBM among managed service providers (MSPs) and in growth markets looking for cost-effective solutions that don't sacrifice performance.

* PureApplication System on POWER7+: This system is aimed at larger enterprises, particularly those in financial services and insurance, where uptime and performance are mission-critical.

* MSP Editions for PureFlex System and Flex System: These new solutions provide a cloud deployment platform that is faster to implement, easier to manage, and more cost-effective than platforms MSPs have to build themselves, helping to cut operating expenses such as systems administration and setup.

* SmartCloud Desktop Infrastructure: Leveraging IBM's PureFlex System and Flex System solutions, this new solution aims to help IT managers easily manage, secure and deploy virtual desktop solutions, and to securely deploy desktop access to mobile devices.

* Expanded Software Patterns Catalog: In addition to the more than 325 applications across 21 industries offered by IBM's 275 ISV partners, new patterns from the company's software organization include solutions for mobile application management, application integration, asset management and social business.
With these new solutions, IBM is sizing its technologies both up and down to make them more effective in increasingly challenging business environments and more affordable to a broader range of customers and use cases. They also reflect a confluence of complementary strategic realities: the continuing development of ever more robust analytics and cloud technologies, alongside the evolving needs and use cases of IBM customers.

Think Big, Spend Small

The new PureData System for Analytics (N2001) is a good example of how this works. IBM achieved a threefold increase in density by more than doubling the overall number of hard drives a single system can support. It also improved the scan rates of those drives, from 120 MB/sec to 130 MB/sec. Meanwhile, new, faster FPGA cores doubled the amount of data a system can process, from about 500 MB/sec to more than 1,000 MB/sec.

This has significant practical effects. Comparing cost per GB/sec of scan rate, the new PureData solution will cost about one fifth as much as an equivalent Oracle offering, according to IBM.
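
The price-performance claim rests on a simple metric: dollars spent per GB/sec of scan rate. A minimal sketch of the arithmetic, using purely hypothetical prices and scan rates (neither IBM's nor Oracle's actual figures):

```python
def cost_per_scan_rate(system_price_usd, scan_rate_gb_per_sec):
    """Dollars per GB/sec of scan rate: the price-performance metric in the comparison."""
    return system_price_usd / scan_rate_gb_per_sec

# Purely hypothetical prices and scan rates, for illustration only:
system_a = cost_per_scan_rate(1_000_000, 10.0)  # $100,000 per GB/sec
system_b = cost_per_scan_rate(2_500_000, 5.0)   # $500,000 per GB/sec
ratio = system_a / system_b                     # 0.2, i.e. "about one fifth"
```

The point of the metric is that a system's sticker price alone says little; normalizing by scan throughput is what makes two analytics appliances comparable.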

The new PureData N2001 also continues Netezza's seven years of delivering 100-200 percent data growth annually. In other words, customers should not be surprised by continuing, impressive, cost-effective improvements from this corner of IBM's PureData portfolio.

IBM's new MSP-focused solutions are also worth considering. MSPs are among IT's most diverse customers in size and specialization, and many (especially those in regional markets or focusing on specific industries) are mid-sized businesses that suffer common pains: limited funding, uncertain revenue streams and a need to "think big" while keeping expenditures small.

IBM's new "mini" PureApplication System and MSP-specific PureFlex and Flex System offerings suggest that the company has clearly taken those points to heart and is doing all it can to work with these customers. That includes providing extremely attractive funding options through IBM Global Financing for MSPs that are also business partners.

Overall, there is much to like in IBM's new PureSystems offerings. By notably enhancing performance and developing entirely new classes of solutions, the company is proving that its cloud and analytics strategies are anything but one-trick ponies.

The new solutions also reflect a constant theme among businesses of every sort: that even as technology evolves, so do the capabilities and needs of its users. As IT's current comprehension era proceeds, successful vendors will be those that -- like IBM -- clearly understand and proactively address this point for their customers' benefit.

Monday, 18 February 2013

Linaro, ARM and the Road to Total Linux Domination

Well it seems like the dust may finally be settling here in the Linux blogosphere, and Linux Girl is fervently hoping for some long-overdue rest.

We endured the launch of Rectangle with Rounded Corners 5; we patiently listened to the endless blaring fanfare surrounding Windows 8's debut. Not to mention the launch of Commander-in-Chief 2012!

Is there no end to the autumnal excitement? Now, more than a few Linux geeks are surely thinking, it's time to get back to life.

'Broad Industry Implications'

Linux Girl couldn't agree more, which is why she was so happy to come upon word of Linaro's Linux-on-ARM project, complete with the backing of Facebook and many others.

"Linaro, the not-for-profit engineering organization developing open source software for the ARM architecture, announced today the formation of the Linaro Enterprise Group (LEG) and the addition of AMD, Applied Micro Circuits Corporation, Calxeda, Canonical, Cavium, Facebook, HP, Marvell and Red Hat as Linaro members," began the announcement last Thursday.

"With significant market interest in energy-efficient ARM-based servers, industry leaders have joined together through Linaro, creating LEG, to collaborate and accelerate the development of foundational software for ARM Server Linux," the group added. "LEG benefits have broad industry implications, including time to market acceleration, lower development costs, and access to innovative and differentiated systems, fundamental to the ARM ecosystem."

There might have been an unrelated din still lingering in the Linux blogosphere, but few FOSS fans failed to take note of this latest news.

'It Is Going to Pay Off'

"This is really good news," enthused Google+ blogger Kevin O'Brien. "I love what Linaro is doing, and it is going to pay off for all of us.

"Although the emphasis right now is on servers, we should note that ARM powers most of the phones and tablets out there, and running a good full Linux distro on a tablet can only be good," O'Brien added.

Indeed, "I'm excited to see the initiative regarding Linux on ARM," agreed Google+ blogger Linux Rants. "I think Linux is a motivated programmer away from working on my toaster oven, but it's great to see the dedication to making it work really well.

"This is yet another stone in the road to total Linux domination," Linux Rants added. "The sooner the better."

'It Looks Promising'

Linaro could "take hold of the huge production of ARMed smart thingies and laptops," suggested blogger Robert Pogson.

In fact, "it may be a great end-around play against Wintel," Pogson added. "There are quite a number of x86 PCs with OEM-installed GNU/Linux, but there are many more ARMed machines being shipped. There is no reason in the world that GNU/Linux could not come pre-installed on ARMed devices."

ARM is apparently "coming fast into the scene, and to stay," agreed Gonzalo Velasco C., a blogger on Google+. "Even AMD is supporting building more Linux applications for ARM based servers! And this means we're going to have more low-power, high-performance, hyperscale processors for servers."

Red Hat, Canonical, Samsung and ST-Ericsson "are already there," he added. "So, it surely looks promising. Lucky us computer users!

"We'll take advantage of them without knowing (when using several internet/cloud services) and hopefully this will lead to new desktop, laptop and netbook multi-core economic processors," he said. "Who knows, but Apple might have to go back to RISC processors after some time [laughs]."

'Intel's Game to Lose'

"I think we are about to see the end of the ARM bandwagon, as ARM simply doesn't scale; it still hasn't reached even Pentium 4 levels of IPC, and companies like Nvidia are having to throw more and more cores at the problem," hairyfeet explained. "We all know how well that's worked for AMD -- more cores equal more power, and without the IPC it's like racing a car in second gear: it sounds fast, but you're not getting anywhere any quicker."

So, "final verdict?" hairyfeet went on. "Sell those ARM chips while you can. They'll be hot for another year, possibly two, just depends on how serious Intel is.

"After that it'll be Intel's game to lose," he concluded. "They can simply throw more resources at any given problem than anybody else, and performance per watt is something they are sinking serious money into."

'More Free and Open'

Nevertheless, "one of the advantages of working with a FOSS system is that if you think Linux would work better on your preferred platform, you can do it yourself or pay to have it done without having to wait for an OS vendor to decide for you," consultant and Slashdot blogger Gerhard Mack pointed out.

Indeed, "perhaps this is the way to go to GNU/Linux," Google+ blogger Alessandro Ebersol agreed.

"I mean, it's a more free and open environment to GNU/Linux, not restrained by two big evil companies (Microsoft and Intel)," he pointed out. "So, GNU/Linux on ARM can blossom and evolve much more than in the x86 world."


Sunday, 17 February 2013

How internet and IT empower Indian SMEs

Over the last decade or so, information technology has taken some giant steps and this exponential growth has changed the way we live and interact with the world. No wonder information technology has enabled many small and medium businesses to carve a niche for themselves. Indeed, IT has triggered the growth of many SMEs and SMBs. And now Indian SMEs are coming forward and adopting the technology like never before to improve their output. When it comes to implementing the latest technology for their business needs, a lot of Indian SMEs are on par with their western counterparts.

IT integration

Slowly but surely, Indian SMEs are willing to invest in new IT solutions to improve their output and efficiency. Over the last couple of years, Indian SMEs have been adopting Enterprise Resource Planning (ERP), clubbing it with Customer Relationship Management (CRM) and sometimes even with Supply Chain Management (SCM). These small and medium enterprises have realised that to maximise the potential of available information technology, IT products must be implemented in tandem.

"Indian SMEs are showing willingness to implement IT products as a whole. They are going for a 360-degree approach. BusinessObjects, Kautilya and Site catalyst are some of the IT products which have become very popular among Indian SMEs. These products not only save a lot of time, but also make functioning of an organisation really smooth. They also aid in creating a tailor-made database and at times this stored information comes in handy," says Raaj Seth of Arth Salution.

However, what exactly is IT integration?

"Well," Raaj explains, "it is an amalgamation of two or more already-existing information systems. It is just like the Hollywood movie The Avengers, where all the heroes, with their absolute powers, were clubbed together to maximise their strengths. Similarly, when you join all these IT products together, the results are exponential. Technically speaking, instead of running ERP, CRM, SCM and BPR on separate software, an SME runs them on an integrated software so that the entire information can then be shared and viewed by one and all."

But what is inspiring Indian SMEs to adopt various emerging IT solutions?

"They have realised that they need to make the most of the IT products to render the desired results. They have also realised that IT is an integral aspect of an organisation. This makes the functioning of an organisation easy and helps them reach potential clients. They also know that embracing IT will enable them to grow domestically as well as globally," explains Anuj Mathur, CEO of Q3tech.

Agrees Dr Alok Bharadwaj, executive vice president, Canon. "IT services and solutions have now become an integral lifeline for all organisations. In the technology world, following the global model provides the competitive edge and Indian SMEs are rapidly exploring to modernise. Fortunately, business cloud service propositions from IT service providers are making entry barriers low for SMEs," he informs.

A strong online presence

Most Indian SMEs are using the internet as a very effective tool to reach out to potential clients. The success of the e-commerce industry bears testimony to this changing trend. Indian SMEs are roping in professional website developers and SEO experts to build an impressive online presence.

"Today the websites of small or medium businesses, be it B2B or B2C, showcase all the updated information. So when potential clients visit a website, they can view product details such as services, existing clients, pricing, etc. Besides that, clients are also given the opportunity to place an order through the website, make an online payment and track the delivery status of the ordered products," points out Raaj.

"Most of the SMEs have realised that they cannot afford a bad user experience. Your website must be like a one-stop shop where everything can be done without physically getting in touch with the SME concerned. If you have a portal, then make the most of it," avers Anuj, adding that when a client is willing to place an order through your website, he must be empowered to do so.

"In this age of technology, what is the first thing that we do when we hear about a company? We Google it and that is where your online marketing and showcasing of products begin. Your first impression has to be extraordinary," concludes Anuj.

Around the world 

In Europe and the US, an SME is an enterprise which has up to 10 employees. In New Zealand, Canada and Israel, an SME is the one which has 19 or fewer employees. In India, an SME's investment in plant and machinery is less than Rs 25 lakh.

Secure your zone

Many a time, companies fall prey to internet frauds. Cybersecurity expert Jubin Thomas lists some preventive measures that can be taken to minimise cyber fraud:

* Invest in security technology, however basic (some security is always better than none)

* According to researchers, a computer connected to the internet without an antivirus is far more vulnerable to infection than one with basic security mechanisms in place

* Use licensed versions of software programmes and operating systems, and keep them updated

* Identify 'too good to be true' schemes (Ponzi and lottery scams, phishing, etc)

* Train the staff (teach employees to identify threats like phishing, spear-phishing and social engineering, and to follow good email practices)

* Use site advisors to verify the authenticity of websites (not all attacks occur from outside the building; it can be an insider job, too)

* Run regular background checks on employees as part of the employment process

* Create company policies

* Escalate/report incidents through an appropriate medium. Even if an incident seems minor, report it to an authority responsible for, and capable of, handling it


Friday, 15 February 2013

PC that can recover itself from crashes

Researchers have created a computer that can immediately recover from crashes by repairing its corrupted data.

University College London computer scientists Peter Bentley and Christos Sakellariou invented the device, dubbed a "systemic" computer, in which information is married up with instructions on what to do with it, New Scientist reported.

Each system of the device has a memory featuring context-sensitive information, meaning it can only interact with other, similar systems.

Rather than using a program counter, the systems are executed at times chosen by a pseudorandom number generator designed to mimic the randomness of nature.

Bentley said that the systems then carry out their instructions simultaneously, with no system taking priority over the others.

The device also holds multiple copies of its instructions, distributed across its many systems, so that if one system gets corrupted, the computer can access another, clean copy to repair its code.

And unlike conventional operating systems that crash when they can't access a bit of memory, the systemic computer carries on regardless, as each system has its own memory.
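
The ideas above can be sketched in miniature. The code below is a loose illustration, not the researchers' implementation: a seeded pseudorandom generator picks which system runs next (there is no program counter), and a system whose code no longer matches a redundant clean copy repairs itself before executing.

```python
import random

def run_systemic(systems, clean_copy, steps, rng):
    """Toy sketch of a 'systemic' computer: pseudorandom scheduling plus
    self-repair of corrupted code from a redundant clean copy."""
    for _ in range(steps):
        s = rng.choice(systems)            # pseudorandom scheduling, no program counter
        if s["code"] != clean_copy:        # corruption detected...
            s["code"] = list(clean_copy)   # ...self-repair from a clean copy
        s["memory"] += 1                   # execute the (now clean) instruction
    return systems

clean_copy = ["inc"]
systems = [{"code": list(clean_copy), "memory": 0} for _ in range(3)]
systems[1]["code"] = ["garbage"]           # corrupt one system's instructions
run_systemic(systems, clean_copy, steps=20, rng=random.Random(0))
```

Note how the corrupted system never brings the whole machine down: the other systems keep executing regardless, and the damaged one is healed the first time it is scheduled.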

Source- TOI

Wednesday, 13 February 2013

DNA Could Become the Next Big Data Warehouse

Researchers at the European Bioinformatics Institute (EMBL-EBI) on Wednesday announced their success at storing data by encoding it to DNA. The system could stand the test of time -- tens of thousands of years, perhaps.

This method for archiving data could make it possible to store 100 million hours of high-definition video in about a cup of DNA, according to the scientists, and given the trend toward Big Data, that could be a big breakthrough. One gram of DNA could hold as much information as more than a million CDs.

Unlike existing methods of data storage -- all of which have relatively limited life spans -- DNA has proven it can endure, literally, for ages. Like any physical carbon-based object, DNA can be destroyed, but it happens to be far more sturdy than paper or tape, and it can't easily be damaged by electromagnetic fields.

"We already know that DNA is a robust way to store information, because we can extract it from wooly mammoth bones -- which date back tens of thousands of years -- and make sense of it," said Nick Goldman of EMBL-EBI. "It's also incredibly small, dense, and does not need any power for storage, so shipping and keeping it is easy."

DNA could have an advantage over many current methods of storage.

Although tape is the cheapest storage medium, its performance is lacking, explained Fang Zhang, storage analyst at IHS iSuppli. Analyzing Big Data on tape would take much longer than on SSDs or HDDs, and depending on how frequently it's used, tape could wear out.

Tomorrow, Tomorrow and Tomorrow

While it's highly unlikely that the words of William Shakespeare would ever be lost, 154 of the Bard's sonnets have been spelled out using DNA. An audio file containing part of Martin Luther King, Jr.'s 1963 "I Have a Dream" speech has also been encoded.

Being stored in DNA could allow those famous words to live on for eons.

"[This is] incredibly durable tagging for living things -- tagging that could transcend generations," said Rob Enderle, principal analyst at the Enderle Group. "The most obvious use would be to record rights into genetically created plants and animals to preserve rights and prevent illegal cloning/copies."

That isn't to say that there are no hurdles to clear. For one, scientists had to develop a code that used the four molecular letters -- also known as "bases" -- of genetic material. These consist of G, T, C and A -- a fairly limited alphabet. Then again, binary code consists of just 0 and 1, and it serves as the basis for most computer languages.
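
A toy encoder shows how a four-letter alphabet can carry binary data, mapping each pair of bits to one base. This is an illustrative sketch only; the actual EMBL-EBI scheme is more elaborate, using an error-tolerant code designed to avoid long runs of repeated bases.

```python
# Toy 2-bits-per-base mapping (not the EMBL-EBI code):
BASE_FOR_BITS = {"00": "A", "01": "C", "10": "G", "11": "T"}
BITS_FOR_BASE = {v: k for k, v in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a DNA letter string, two bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BASE_FOR_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> bytes:
    """Invert encode(): read the bases back into bytes."""
    bits = "".join(BITS_FOR_BASE[base] for base in dna)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"DNA")        # four bases per byte
round_trip = decode(strand)    # -> b"DNA"
```

Even this crude mapping packs four bases per byte; the density figures quoted above come from how physically small each base is, not from any cleverness in the code.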

"At some future point, you might actually embed notes on how the plant or animal was created in the DNA," Enderle told TechNewsWorld. "In the case of a weaponized biological agent, you could also use this to better identify the source, should it be released, and you might be able to brand a benign virus and use it to model how a similar hostile virus would spread using a combination of DNA labeling and then population sampling to track the spread."

Knowing the Code

The key to ensuring that this data can be archived and also accessed is preserving knowledge of the code. There are numerous undeciphered writing systems that could hold long lost information. However, the EMBL-EBI researchers don't think this will be a problem.

They've worked to create a code that is error tolerant in molecular form. As long as someone knows the code, the data can be read back.

Despite these hurdles, DNA could be the storage method of the future, especially as Big Data begins to take us from the world of gigabytes to xeobytes.

"DNA as storage represents a new paradigm," said James Canton, Ph.D., of the Institute for Global Futures. "In a world generating xeobytes of data, we are facing a huge data tsunami. We're looking at ways to store, encrypt and secure all this data."

"DNA represents one paradigm for the future," Canton told TechNewsWorld. "DNA as a paradigm for storage represents an emerging platform towards quantum mechanics. That's when things will really change."


Tuesday, 12 February 2013

Nasscom: Indian IT sector exports to grow 12-14% in FY14

Exports from India's IT outsourcing sector are expected to grow between 12 and 14 per cent in the fiscal year starting April to as much as $87 billion, according to the National Association of Software and Services Companies (Nasscom), the industry lobby.

Exports from the industry, which counts the United States and Europe as its biggest markets, were estimated to have grown 10.2 per cent to $75.8 billion in the current fiscal year, Nasscom said on Tuesday.

The global economic uncertainty caused corporations in the United States and Europe to cut back on IT spending, leading Nasscom to bring down its forecast for the current fiscal year.

Exports would only grow to touch the lower end of an earlier forecast of 11-14 per cent, Nasscom had said in November.

Strong December-quarter results from second-ranked Infosys and its peers, including top-ranked Tata Consultancy Services, prompted investors to speculate that the coming year will see an increase in IT spending, while overall budgets are expected to remain largely unchanged.


Wednesday, 6 February 2013

After the Storm: Social, Virtualization Allow Business as Usual

Today's workforce never truly switches off due to the proliferation of mobile devices and the blurring line between our personal and professional lives. Some have questioned whether this "always on" mentality will ultimately cause more harm than good, both for individuals and businesses as a whole.

While the jury remains out on that question, one positive aspect of the constant availability of technology is the impact on business continuity and disaster recovery.

Despite the widespread devastation caused by Hurricane Sandy in October, many businesses in the affected area were able to maintain some level of continuity throughout the storm, drawing on mobile, social and virtual technologies to maintain communications with customers, employees, partners and suppliers. Here's a look at how new technology has fostered greater business continuity, even in the face of unexpected events.

Social Media

When you consider how quickly social networks have evolved from use by college kids to accepted platforms for business communications, it's truly staggering. The powerful reach of these publicly hosted channels gives organizations the ability to share updates on business-critical issues when their own networks and websites are down due to a storm or other disaster.

Social networks also enable companies to disseminate information to a wider audience that may not be as easily reached via more traditional methods. In the case of Sandy, for example, many retailers used Twitter and Facebook to update shoppers on which store locations were open.

Social networks also provide an avenue for organizations to stay in touch with their employees during unexpected events -- providing updates on office openings, public transportation and relief efforts for impacted colleagues, real-time updates that could not be communicated before the rise of social channels.

Mobile Devices

A decade ago, it was uncommon for anyone outside of the C-suite to have email access when not physically in the office. Today it's safe to say that the majority of employees have their smartphone or tablet hooked up to their corporate email.

And because many companies have optimized enterprise applications for the mobile platform, users can work remotely in much the same capacity as they would in the office, even if only armed with a mobile device.

Faster wireless networks and mobile hotspots also mean that employees can quickly connect to the Internet even if their work or home Wi-Fi is wiped out by a storm.

Using Virtualization

By definition, virtualization extends the traditional enterprise far beyond its physical limits. Many companies leverage server virtualization for disaster recovery, which is a much faster option for restoring systems than traditional tape backups.

In Hurricane Sandy's aftermath, many companies built virtual instances of employees' systems so they could resume work from alternate locations. By untying software and data from physical machines, businesses can more easily relocate impacted workers and get them back up and running with all the tools and information they need to conduct business as usual.

Remote Support

Another innovation of virtual technology, remote support solutions enable IT to securely access end users' computers, take control and troubleshoot their technology no matter where they're located. In addition to saving IT time and money, remote support solutions can be used to remotely build systems and virtual instances (as mentioned above) following disasters. This speeds up the recovery efforts for impacted offices, as IT reps halfway around the world can use the technology to rebuild systems and support users in the affected area as if they were on site.

Of course, this is not to say that hurricanes or other natural disasters still don't have a significant negative impact upon businesses. But technological advances have made it possible for organizations to continue operations in some capacity during unforeseen events.

For those companies that have yet to take advantage of mobile, virtualization and other innovations, one has to wonder if events like Sandy may cause them to reconsider. After all, enabling employees to be productive outside the office can also increase productivity during normal operations.

Whereas just a few years ago organizations had to start from square one following a major business interruption, today's technology enables companies to maintain relative continuity throughout and get back up to speed much more efficiently after an unforeseen disruption.