About Me

Thursday, October 21, 2010

Challenges in Testing Mobile Applications

The way we access the net has changed drastically in the past few years. Today a person can manage his documents using Google Docs, find the shortest route to that important client meeting using Ovi Maps, sift through e-mails on his Yahoo account using the Opera Mini browser, check out what his friends are up to using his Facebook widget, and chat with friends using Meebo, without even being near his laptop. 

Welcome to the world of the mobile web. Mobile applications today have enabled us to have, quite literally, all the information we need at our fingertips. These applications are now integrating with our banks, hospitals and workplaces and are set to become an inseparable part of our lives.
According to Gartner, the worldwide mobile applications market is currently estimated at around $6.2 billion, and by 2013 it is expected to reach around $21.6 billion. It is no secret that this market has huge untapped potential. For the companies involved in creating and testing mobile-based applications, this is the tipping point.

However, mobile applications need to be carefully developed and tested because, unlike a computer, a mobile device's security infrastructure may not be as hardened. Recent doubts regarding the security of Citibank's mobile banking application are a case in point. Citibank released its mobile application for the iPhone in March 2009, allowing customers to keep track of their accounts. In July 2010, Citibank released an updated version and asked users to upgrade, as the earlier version had a security flaw that left critical account information vulnerable to attack. 

To develop a deeper understanding of how mobile applications need to be tested, we must appreciate that mobile devices form a new market altogether, with its own distinct concerns. These concerns arise from a host of features that make the mobile device market different from the PC market. These features are:

Multiplicity of mobile platforms
Multiplicity of mobile platforms is one of the biggest challenges the industry faces today. Handset manufacturers use vastly different platforms such as Symbian, MeeGo, Bada, Android, J2ME, BlackBerry and Brew. An application needs to be tested on all of them to ensure that it works for everyone. If a company plans to test on only a few platforms, it runs the risk of locking out potential customers who might be using the others.

Multiplicity of Cell phone Models
Every handset maker offers hundreds of models to its customers. These models have different screen sizes, hardware configurations and image-rendering capabilities. An exhaustive test across these devices is the only way to ensure that an application works on all of them. Otherwise, your application could end up like one of those mobile anti-virus tools that consume so much memory that the user cannot do anything else, defeating the very purpose they were installed for: to let the user surf the net safely. However, exhaustive testing can be very expensive and time-consuming. The testing process should therefore focus on careful selection of a representative sample of devices and ensure that the application delivers acceptable performance on all target hardware configurations. 
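A minimal sketch of what "careful selection of the sample" might look like in practice, assuming a purely hypothetical device catalogue; it builds the full platform/screen/memory matrix and then greedily picks a small set of configurations that still covers every individual attribute value at least once:

```python
from itertools import product

# Hypothetical device attributes -- a real catalogue would come from market data.
PLATFORMS = ["Symbian", "Android", "BlackBerry", "J2ME"]
SCREENS = [(240, 320), (320, 480), (360, 640)]
MEMORY_MB = [64, 128, 256]

# Full test matrix: every combination of platform, screen and memory (4 * 3 * 3 = 36).
full_matrix = list(product(PLATFORMS, SCREENS, MEMORY_MB))

def covering_sample(matrix):
    """Greedily pick configurations until every platform, screen size and
    memory value has been seen at least once (single-value coverage)."""
    needed = set()
    for platform, screen, mem in matrix:
        needed.update({("platform", platform), ("screen", screen), ("memory", mem)})
    chosen = []
    for config in matrix:
        platform, screen, mem = config
        tags = {("platform", platform), ("screen", screen), ("memory", mem)}
        if tags & needed:          # this config exercises something not yet tested
            chosen.append(config)
            needed -= tags
        if not needed:
            break
    return chosen

sample = covering_sample(full_matrix)
print(f"Testing {len(sample)} of {len(full_matrix)} configurations:")
for cfg in sample:
    print(cfg)
```

Pairwise (all-pairs) selection tools go further by covering every pair of attribute values, but even this simple single-value pass cuts 36 runs down to a handful.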

Different Carriers
There are over 400 mobile operators across the world offering GSM, CDMA and some lesser-known local networking standards. Each of these operators, to put it simply, does things in its own way. There is a host of issues involved in enabling an application for use on different carriers. Each network provider has implemented systems that behave slightly differently from the others, as they use technology from different vendors. An Airtel network therefore differs from a Vodafone one, and each presents slightly different challenges. Network providers also insert web proxies to control the transfer of information on their systems. The 'walled garden' approach, where users are allowed to access only a few pre-decided websites, is one example of data control by operators. Another technique worth mentioning is transcoding, which scales down fixed web content so that it fits mobile phone screens. With carrier requirements differing, testing needs to cover all target carriers and ensure that the application functions on each of them.

Location
Location is another major constraint in testing mobile applications. An application should not eat bandwidth or fail to function when signals are weak. For example, with A-GPS devices, which use cell-tower triangulation to identify your location, low signals sometimes result in a loss of data, and suddenly a passenger is left wondering where to turn next. These applications therefore need to be tested for their ability to work with weak signals as well. In fact, to fully test against a network, the tester needs to be connected to the target network, which means being in China to test on China Mobile and in India to test on BSNL.
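As an illustration only, here is one way a test harness might simulate a weak-signal environment without travelling to the target network: wrap the application's fetch routine so that calls are randomly delayed or dropped, then assert that the app degrades gracefully instead of crashing. All names here are hypothetical stand-ins for the application under test.

```python
import random

class WeakSignalNetwork:
    """Fake transport that mimics a congested or low-signal carrier link."""
    def __init__(self, drop_rate=0.3, max_delay_s=5.0, seed=42):
        self.drop_rate = drop_rate
        self.max_delay_s = max_delay_s
        self.rng = random.Random(seed)

    def fetch(self, url):
        # We record the simulated latency rather than actually sleeping.
        if self.rng.random() < self.drop_rate:
            raise TimeoutError(f"simulated signal loss while fetching {url}")
        delay = self.rng.uniform(0, self.max_delay_s)
        return {"url": url, "latency_s": round(delay, 2), "body": "..."}

def get_route(network, origin, destination):
    """Hypothetical app code under test: must fall back to cached data on failure."""
    try:
        return network.fetch(f"https://maps.example.com/route?{origin}->{destination}")
    except TimeoutError:
        return {"cached": True, "body": "last known route"}

def test_route_survives_signal_loss():
    net = WeakSignalNetwork(drop_rate=0.9)   # almost no signal
    result = get_route(net, "office", "client")
    assert result is not None                # the app must never crash or hang

if __name__ == "__main__":
    test_route_survives_signal_loss()
    print("graceful degradation OK")
```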
Taken together, these challenges make independent testing of mobile applications a complex and expensive affair. Herein, however, lies the opportunity for today's entrepreneurs to devise new ways of addressing these issues. Emerging technologies like cloud computing can go a long way toward addressing the challenges of testing mobile applications. Creating tester communities that can test an application for particular geographies could be the next big idea for this industry.
The key is to constantly innovate and identify new business processes that maximize revenues and cut costs without compromising on service requirements.

Investment in open source software set to rise

The open source software market has reached a turning point, with organizations in the United States, United Kingdom and Ireland now committing to clear strategies and policies for open source software development. 

According to the findings of a recently released survey, more than two-thirds of organizations (69 percent) anticipate increased investment in open source in 2010, with more than a third (38 percent) expecting to migrate mission-critical software to open source in the next twelve months.
The survey of 300 large organizations in both the private and public sectors found that half of the respondents (50 percent) are fully committed to open source in their business, while more than a quarter (28 percent) say they are experimenting with open source and keeping an open mind about using it.
Furthermore, two-thirds of all respondents (65 percent) noted that they have a fully documented strategic approach for using open source in their business, while another third (32 percent) are developing a strategic plan. Of the organizations using open source, almost nine out of ten (88 percent) will increase their investment in the software in 2010 compared to 2009.

Through both research and work with clients, we can see an increase in demand for open source based on quality, reliability and speed, not just cost savings. This is a significant change from just two years ago when uptake was driven mainly by cost savings. We can expect to see this trend develop as open source continues to evolve and address even more business critical functions.

When it comes to the benefits of open source, cost was no longer viewed as the key benefit, with respondents focusing instead on other aspects:
* 76 percent of respondents in the UK and US cited quality as a key benefit of open source
* Seven in ten overall (70 percent) cited improved reliability
* Better security/bug fixing was cited by nearly as many (69 percent) across both countries. 

Cost control with open source
Although cost savings are not the primary driver for open source adoption, half of the respondents (50 percent) do cite open source as contributing to an overall lower total cost of ownership.
When asked about the greatest cost savings in open source, the vast majority of organizations surveyed believe they can be made on software maintenance costs (71 percent), initial software development time (33 percent) and initial development costs (33 percent).
Open source software development on the rise but companies still not so open to sharing
The volume of open source software development is set to rise over the next three years. In 2009, 20 percent of software developments were in open source. This is expected to rise marginally to 23 percent in 2010 and to 27 percent by 2013. 

One notable finding, however, is that less than a third (29 percent) are willing to contribute their own solutions back to the community.

Top 10 strategic technologies for 2011

Gartner, Inc. highlighted the top 10 technologies and trends that will be strategic for most organizations in 2011. The analysts presented their findings during Gartner Symposium/ITxpo, being held through October 21.
Gartner defines a strategic technology as one with the potential for significant impact on the enterprise in the next three years. Factors that denote significant impact include a high potential for disruption to IT or the business, the need for a major dollar investment, or the risk of being late to adopt.
A strategic technology may be an existing technology that has matured and/or become suitable for a wider range of uses. It may also be an emerging technology that offers an opportunity for strategic business advantage for early adopters or with potential for significant market disruption in the next five years.   As such, these technologies impact the organization's long-term plans, programs and initiatives.
“Companies should factor these top 10 technologies in their strategic planning process by asking key questions and making deliberate decisions about them during the next two years,” said David Cearley, vice president and distinguished analyst at Gartner.
“Sometimes the decision will be to do nothing with a particular technology,” said Carl Claunch, vice president and distinguished analyst at Gartner. “In other cases, it will be to continue investing in the technology at the current rate. In still other cases, the decision may be to test or more aggressively deploy the technology.”

The top 10 strategic technologies for 2011 include:

Cloud Computing.
Cloud computing services exist along a spectrum from open public to closed private. The next three years will see the delivery of a range of cloud service approaches that fall between these two extremes. Vendors will offer packaged private cloud implementations that deliver the vendor's public cloud service technologies (software and/or hardware) and methodologies (i.e., best practices to build and run the service) in a form that can be implemented inside the consumer's enterprise. Many will also offer management services to remotely manage the cloud service implementation. Gartner expects large enterprises to have a dynamic sourcing team in place by 2012 that is responsible for ongoing cloudsourcing decisions and management.

Mobile Applications and Media Tablets.
Gartner estimates that by the end of 2010, 1.2 billion people will carry handsets capable of rich, mobile commerce providing an ideal environment for the convergence of mobility and the Web. Mobile devices are becoming computers in their own right, with an astounding amount of processing ability and bandwidth. There are already hundreds of thousands of applications for platforms like the Apple iPhone, in spite of the limited market (only for the one platform) and need for unique coding.
The quality of the experience of applications on these devices, which can apply location, motion and other context to their behavior, is leading customers to interact with companies preferentially through mobile devices. This has led to a race to push out applications as a competitive tool to improve relationships and gain advantage over competitors whose interfaces are purely browser-based.

Social Communications and Collaboration. 
Social media can be divided into: (1) social networking: social profile management products such as MySpace, Facebook, LinkedIn and Friendster, as well as social network analysis (SNA) technologies that employ algorithms to understand and utilize human relationships for the discovery of people and expertise; (2) social collaboration: technologies such as wikis, blogs, instant messaging, collaborative office and crowdsourcing; (3) social publishing: technologies that assist communities in pooling individual content into a usable, community-accessible content repository, such as YouTube and flickr; (4) social feedback: gaining feedback and opinion from the community on specific items, as witnessed on YouTube, flickr, Digg, Del.icio.us and Amazon. Gartner predicts that by 2016, social technologies will be integrated with most business applications. Companies should bring together their social CRM, internal communications and collaboration, and public social site initiatives into a coordinated strategy.

Video. 
Video is not a new media form, but its use as a standard media type in non-media companies is expanding rapidly. Technology trends in digital photography, consumer electronics, the web, social software, unified communications, digital and Internet-based television and mobile computing are all reaching critical tipping points that bring video into the mainstream. Over the next three years, Gartner believes that video will become a commonplace content type and interaction model for most users, and by 2013 more than 25 percent of the content that workers see in a day will be dominated by pictures, video or audio.

Next Generation Analytics.
The increasing computing capabilities of devices, including mobile devices, along with improving connectivity, are enabling a shift in how businesses support operational decisions. It is becoming possible to run simulations or models to predict future outcomes, rather than simply providing backward-looking data about past interactions, and to do these predictions in real time to support each individual business action. While this may require significant changes to existing operational and business intelligence infrastructure, the potential exists to unlock significant improvements in business results and other success rates.
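For illustration, the shift described here is roughly the difference between reporting last quarter's churn and scoring a single customer action as it happens. The sketch below, with entirely invented weights and features, shows the per-action, real-time pattern: a pre-trained model is applied at the moment of decision rather than in a nightly report.

```python
import math

# Hypothetical, pre-trained weights (in practice these come from historical data).
WEIGHTS = {"days_since_last_order": -0.08, "support_tickets": 0.4, "discount_used": -0.6}
BIAS = -0.5

def churn_probability(features):
    """Logistic score for one customer interaction, computed at decision time."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def next_best_action(features):
    """Per-action decision: intervene now instead of waiting for a monthly report."""
    p = churn_probability(features)
    return ("offer_retention_discount" if p > 0.5 else "no_action"), round(p, 2)

print(next_best_action({"days_since_last_order": 5, "support_tickets": 4, "discount_used": 0}))
```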

Social Analytics.
Social analytics describes the process of measuring, analyzing and interpreting the results of interactions and associations among people, topics and ideas. These interactions may occur on social software applications used in the workplace, in internally or externally facing communities, or on the social web. Social analytics is an umbrella term that includes a number of specialized analysis techniques such as social filtering, social network analysis, sentiment analysis and social media analytics. Social network analysis tools are useful for examining social structure and interdependencies as well as the work patterns of individuals, groups or organizations. Social network analysis involves collecting data from multiple sources, identifying relationships, and evaluating the impact, quality or effectiveness of a relationship.
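A toy illustration of the kind of analysis described above, assuming a made-up log of workplace interactions; it identifies relationships from the raw events and ranks people by how central they are to the communication structure (simple degree centrality):

```python
from collections import defaultdict

# Hypothetical interaction log: (sender, receiver) pairs from mail, chat, wiki edits, etc.
interactions = [
    ("asha", "bo"), ("asha", "chen"), ("bo", "chen"),
    ("chen", "dia"), ("dia", "asha"), ("chen", "asha"), ("eli", "chen"),
]

# Step 1: identify relationships and weight them by interaction count.
strength = defaultdict(int)
for a, b in interactions:
    strength[frozenset((a, b))] += 1

# Step 2: evaluate each person's position (degree centrality = share of others they touch).
people = {p for pair in strength for p in pair}
degree = {p: sum(1 for pair in strength if p in pair) for p in people}
centrality = {p: degree[p] / (len(people) - 1) for p in people}

for person, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{person}: {score:.2f}")
```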

Context-Aware Computing.
Context-aware computing centers on the concept of using information about an end user's or object's environment, activities, connections and preferences to improve the quality of interaction with that end user. The end user may be a customer, business partner or employee. A contextually aware system anticipates the user's needs and proactively serves up the most appropriate and customized content, product or service. Gartner predicts that by 2013, more than half of Fortune 500 companies will have context-aware computing initiatives and, by 2016, one-third of worldwide mobile consumer marketing will be context-awareness-based.
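A highly simplified sketch of the "anticipate and serve up" idea, with invented context fields and scoring rules; a real system would draw these signals from location services, calendars and interaction history:

```python
# Hypothetical catalogue of content a system could proactively offer.
OFFERS = [
    {"id": "nearby_branch_map",   "needs": {"location": "travelling"}},
    {"id": "lunch_voucher",       "needs": {"time_of_day": "noon", "location": "city_centre"}},
    {"id": "expense_report_form", "needs": {"activity": "back_from_trip"}},
]

def score(offer, context):
    """Count how many of the offer's context requirements the user currently matches."""
    return sum(1 for key, value in offer["needs"].items() if context.get(key) == value)

def best_offer(context):
    """Pick the most relevant item for this user's current context, if anything matches."""
    ranked = sorted(OFFERS, key=lambda o: score(o, context), reverse=True)
    top = ranked[0]
    return top["id"] if score(top, context) > 0 else None

# Example context assembled from (hypothetical) location, calendar and activity signals.
print(best_offer({"location": "city_centre", "time_of_day": "noon", "activity": "working"}))
```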

Storage Class Memory.
Gartner sees huge use of flash memory in consumer devices, entertainment equipment and other embedded IT systems. Flash also offers a new layer of the storage hierarchy in servers and client computers, with key advantages such as space, heat, performance and ruggedness. Unlike RAM, the main memory in servers and PCs, flash memory is persistent even when power is removed; in that way it behaves more like disk drives, where information is placed and must survive power-downs and reboots. Given the cost premium, simply building solid-state disk drives from flash would tie up that valuable capacity on all the data in a file or an entire volume. A new, explicitly addressed layer, separate from the file system, instead permits targeted placement of only the high-leverage items of information that need the mix of performance and persistence that flash memory provides. 

Ubiquitous Computing. 
The work of Mark Weiser and other researchers at Xerox's PARC paints a picture of a coming third wave of computing, in which computers are invisibly embedded into the world. As computers proliferate and everyday objects gain the ability to communicate via RFID tags and their successors, networks will approach and surpass the scale that can be managed in traditional, centralized ways. This leads to the important trend of embedding computing into operational technology, whether as calm technology or explicitly managed and integrated with IT. It also gives us important guidance on what to expect from proliferating personal devices, the effect of consumerization on IT decisions, and the capabilities that will be driven by the rapid inflation in the number of computers per person.

Fabric-Based Infrastructure and Computers. 
A fabric-based computer is a modular form of computing where a system can be aggregated from separate building-block modules connected over a fabric or switched backplane. In its basic form, a fabric-based computer comprises separate processor, memory, I/O and offload modules (GPU, NPU, etc.) that are connected to a switched interconnect and, importantly, the software required to configure and manage the resulting system(s). The fabric-based infrastructure (FBI) model abstracts physical resources — processor cores, network bandwidth and links, and storage — into pools of resources that are managed by Fabric Resource Pool Manager (FRPM) software. The FRPM is in turn driven by the Real Time Infrastructure (RTI) Service Governor software component. An FBI can be supplied by a single vendor, by a group of vendors working closely together, or by an integrator — internal or external.
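To make the FBI/FRPM idea concrete, here is a deliberately simplified sketch of a resource-pool manager that assembles a "system" from pooled processor cores, memory and bandwidth. The class and field names are illustrative only and do not correspond to any vendor's actual FRPM or RTI interfaces.

```python
from dataclasses import dataclass, field

@dataclass
class FabricPools:
    """Pooled physical resources abstracted away from individual modules."""
    cores: int
    memory_gb: int
    bandwidth_gbps: int

@dataclass
class FabricResourcePoolManager:
    pools: FabricPools
    systems: list = field(default_factory=list)

    def compose_system(self, name, cores, memory_gb, bandwidth_gbps):
        """Carve a logical system out of the pools, if enough capacity remains."""
        if (cores > self.pools.cores or memory_gb > self.pools.memory_gb
                or bandwidth_gbps > self.pools.bandwidth_gbps):
            raise RuntimeError(f"insufficient fabric capacity for {name}")
        self.pools.cores -= cores
        self.pools.memory_gb -= memory_gb
        self.pools.bandwidth_gbps -= bandwidth_gbps
        system = {"name": name, "cores": cores, "memory_gb": memory_gb,
                  "bandwidth_gbps": bandwidth_gbps}
        self.systems.append(system)
        return system

frpm = FabricResourcePoolManager(FabricPools(cores=128, memory_gb=1024, bandwidth_gbps=80))
print(frpm.compose_system("analytics-node", cores=32, memory_gb=256, bandwidth_gbps=10))
print(frpm.pools)  # remaining capacity after composition
```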

Farmers Urged To Embrace Technology

 


African farmers have been urged to use technology to boost food production on the continent. Low levels of agricultural productivity over the years have contributed to the recurrent food shortage that affects over 30 per cent of the population, representing about 260 million people.

This problem can be addressed if farmers equip themselves with up-to-date agronomic knowledge, embrace technologies that promote adequate fertilizer application and use improved seeds that guarantee high yields.

There are plans to boost food production in tropical Africa, and Pedro Sanchez, Director of Tropical Agriculture and Rural Environment at Columbia University, New York, believes that it is possible to save many hungry people in Africa.

Interacting with visiting international journalists at a seminar on ‘Development and poverty reduction’, organized by the German capacity-building institution Inwent and the Initiative for Policy Dialogue of Columbia University, New York, Mr Sanchez, who is also the Director of the Millennium Villages Project at the university's Earth Institute, said research in many African countries had shown that the soil lacks nitrogen, which could be replenished through prudent use of fertilizers.

As a solution, agronomists and soil scientists are collaborating with Google to use the latest technology for a digital soil mapping that would help farmers around the world.

The new technology, according to the Director of the African Soils Information Services (AfSIS), involves developing a digital soil map of the world that would allow farmers, through their mobile phones, to ascertain the level of nitrogen required by their land.

“Just about everywhere in Africa there are mobile phones and through the soil mapping project, farmers can send questions on how much fertilizer they can apply to the soil and get answers through SMS.”

He said the lack of a subsidy program, farmers' poor access to farm inputs and weak extension services were the underlying factors behind the failure of a continent that has the potential to produce enough food.

“It is essential that the extension services of the agricultural sector are equipped with the needed logistics.” The senior research scholar also dismissed claims that organic farming in Africa was not feasible because of its depleted soil.

He noted that it is possible to promote high-yielding hybrid seeds in Africa, while acknowledging that the promotion of genetically modified organisms (GMOs) is viewed by many as a new threat.

In most parts of Africa, civil society organisations and other non-governmental organizations are against the introduction of GMOs, claiming that they pose a threat to human lives and the environment.

“There is no scientific evidence of any damage to human health or the environment. For instance, some GMOs, like Bt cotton grown largely in South Africa, have a positive environmental effect as they reduce pesticide application,” he stated.

Two different research bodies in Europe have indicated that there are only simple agronomic problems related to GMOs. “It is not a scientific argument any more but a political argument. Burkina Faso had the political courage and is now a major producer of cotton in West Africa, and they are doing well.”

Cartoonist maps world of social networking


A cartoonist has created a map which imagines websites as countries and shows their size based on how popular they are.
Randall Munroe, a university graduate from Massachusetts, US, has used his imagined world map to represent the levels of social activity in online communities such as Facebook, Twitter and Skype.
Munroe is best known as the creator of the cult webcomic xkcd.
The land mass of each mythical country named after a website equates to the popularity of that site, showing effectively how social activity is spread throughout the internet.
Munroe based his Map of Online Communities on statistical information, including website hits and the number of members each community had over the summer of 2010, reports the Telegraph. The new map is an updated version of one Munroe created in 2007.
Email dominates the map, and unsurprisingly Facebook features prominently, as do Twitter and Skype.
The map does throw up some surprises. Myspace, once one of the most popular social networking sites, is barely visible, only slightly larger than LinkedIn, a site which aims to connect people through business profiles.
Farmville and Happy Farm sit prominently while YouTube, the video-sharing website, is a good-sized island.
The most surprising inclusion for many is QQ, a Chinese instant messaging service which has more than 100 million users but is virtually unheard of in the west.
Munroe said: "This update map uses size to represent total social activity in a community - that is, how much talking, playing, sharing or other socialising happens there. This meant some comparing of apples and oranges, but I did my best to be consistent."

Wednesday, October 20, 2010

Why Thought Leadership Rules the B2B World

Top B2B marketers understand two key objectives: 1) building new business and 2) becoming a thought leader. 56% of B2B marketing executives cited “positioning our company as a thought leader” as their top objective in a recent Economist Intelligence Unit study. In addition, a third of respondents felt thought leadership would be the best way for providers to market services in the next 3-5 years.
But why is thought leadership dominating the B2B marketing world? In a world of generating leads and CRM, does the answer to successful B2B marketing lie within the broader control of your company in addition to the collaboration of your marketing and sales forces? Here are some things to consider.
  • You are no longer the hunter, you are now being hunted –The web has enabled top executives to gather their own information via references, social networks and search engines, as opposed to tapping the shoulder of the corporate librarian. A Forbes and Google survey shows 64% of senior executives are clicking “search” more than six times a day seeking business information. It wasn’t so long ago when B2B marketers were doing the hunting. But now, thanks to the web, they are instead being hunted. This puts huge importance on being a thought leader. Compelling, thought leadership information that resonates with a niche will increase your digital marketing KPIs ahead of competitors.
  • The thought leadership & social media marriage – B2B social media marketing builds brands. It creates momentum for awareness, loyalty and equity while strengthening lead-generation efforts. Members of social networks (followers, fans, etc.) are always on the prowl for good, credible information. With 69% of B2B buyers using social networks, the opportunity to share relevant content is evident. B2B buyers and marketers are turning to social media to gather their resources and, while there, looking for industry thought leaders to share information. It’s almost a transitive property: if social media is dominating B2B marketing, and thought leadership dominates social media, then thought leadership dominates B2B marketing.
  • Customers want to know “how” – Prior to the widespread adoption of social media, public company information mainly consisted of “what” a company did, and not “how” the company did it. Today, however, customers want to attach a personality to a company and, more importantly, they want to know how companies differ. Thought leadership enables customers to separate companies into the who’s who of the industry. They’re better able to understand a company’s personality and ultimately understand how its processes and strategies work.
Thought leadership is continuing to gain momentum and is expected to be a huge driving force of B2B marketing strategies. Understanding thought leadership is the first step in the process. The next step consists of understanding how to effectively and successfully become the thought leader of your industry.

Tuesday, October 19, 2010

India to spend $2.3 trillion to go green

India will spend $2.3 trillion to boost its energy sector by 2030, a figure that includes a substantial outlay for expanding the country's energy basket to include green sources such as solar, wind and nuclear power.

It intends to improve energy efficiency and use clean technology to help Asia's third-largest economy balance growth and environmental aims. B.K. Chaturvedi, a member of the Planning Commission that charts India's growth path, recently said that, as one of the world's three largest carbon emitters, it is essential for the country to shift to a greener economy.
   
Last year the country set a goal for slowing the growth of its emissions, saying it will try to rein in its 'carbon intensity' (the amount of carbon dioxide emitted per unit of GDP) by between 20 and 25 percent by 2020, from 2005 levels.

Power remains a top priority, both in terms of increasing energy efficiency and in the use of renewables. India aims to add about 100 gigawatts (GW) of power generation capacity by early 2017, much of it from coal, despite conceding it would miss, by 79 percent, an earlier five-year target of adding 78.7 GW by March 2012. India will also need to keep burning cheaper fossil fuels to extend electricity to the half of its one-billion-plus population that is without power.


"We should develop this in the context of a two-pronged strategy: The first is improving energy efficiency, and the second is changing the mix of the energy which we consume. Some of it will be towards energy consumption, but a lot of it will go towards improving energy efficiency and improving the composition of energy," he added. The portion of the amount would be spent on making the shift to a green economy in the next two decades is so far not mentioned.

Smart Gadgets Go Green

Saving the planet one gadget at a time

Saving energy in a home full of gadgets can seem impossible but the latest home energy technology can help you save the planet - and money.

Gadgets such as smart meters are part of the government's plan to cut carbon emissions. Due in your home sometime after 2013, they show you exactly what electricity you are using, and how much it is costing.
They will also communicate with your supplier, automatically taking readings and making switching suppliers much simpler.
But you do not have to wait until the end of the decade to start cutting your consumption.
The Wattson is a simplified smart meter. A sensor clipped onto your fusebox monitors the electromagnetic field in the wires coming from it, and the accompanying transmitter sends the information, wirelessly, to the shiny Wattson box.
It shows how much power you are using right now, and how much that would cost if - rather unrealistically - you continued to keep everything on all day, every day of the year.
It really does show how switching on the kettle, the oven and any other home appliance sends your wattage sky high.
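The "cost if you kept everything on all year" figure is simple arithmetic; a rough sketch, assuming an illustrative tariff of 12p per kWh:

```python
def annual_cost_if_left_on(current_watts, pence_per_kwh=12.0):
    """Project today's instantaneous draw over a full year, as a Wattson-style display might."""
    kwh_per_year = current_watts / 1000.0 * 24 * 365
    return kwh_per_year * pence_per_kwh / 100.0  # result in pounds

# Example: kettle (2 kW) plus oven (2.5 kW) plus background load (300 W) left on permanently.
print(f"~£{annual_cost_if_left_on(2000 + 2500 + 300):,.0f} per year")
```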
The accompanying software, Holmes, is more useful. It can show usage over time, and settings can be tweaked for a particular energy tariff.
A similar system is the AlertMe Energy Home Hub, which also uses a plug and transmitter.
It connects to the internet using your router, and gives you access to your home's energy statistics via an online dashboard, a handheld controller or iPhone app.
You can get a summary on your iGoogle page by way of the Powermeter widget and the hub also talks to smart plugs, which can be switched on or off remotely. 

Magic meter

Smart meters might keep you informed but actually saving the energy - and money - is still down to you.
"Smart meters are not magic. They are a tool and they are like every other tool - you have got to use it and use it intelligently," said Sarah Darby, from the Environmental Change Institute at Oxford University.
"You can take a smart meter round the house, switching stuff on and off, to see what difference it makes to how much [energy] you are using.
"That way, you can see which [devices] are your big users and which are your small users and before you go to bed at night, you can see what is still switched on."
But if smart metering seems like too much effort, one option might be the PassivEnergy management system, which aims to cut your bills with hardly any human intervention.
The system takes over your central heating and hot water controls, using in-room thermometers, a wireless hub, and a new central heating controller to manage your energy use more efficiently.
Initially, you use the handheld controller to tell the system about your normal routine - how warm you like it, when you are usually in and out, when you go to bed and when you are planning to go on holiday.
Then it starts to watch what you do and learn your habits.
Director of market intelligence for PassivSystems, Wayne Muncaster, says this is the clever part of the system.
"You can tell it when you are in, when you are out, and the hot water you believe you need.
"But what the system will learn over time is what your habits are and what you actually do - how many showers you take in the morning, whether you have a bath in the middle of the day. The system will begin to understand your lifestyle habits.
"That means the system will only fire, and actually burn energy when you actually need it, not when you tell it you need it or when you think you may need it, but when you actually begin to use the system."
If you come home unusually early, you can override your settings by pressing the "occupancy" buttons to flip the system from "out" to "in".
And if you decide not to come home at all, you can remotely control the system using an iPhone app.
It is not cheap - the complete kit costs nearly £600, with an annual fee of £20 or £30, although the company says customers can expect to save that amount on their bills in two or three years. 

Smart fridge

Using technology to save energy in your own home can save you money today, but it could also change the way we power the country tomorrow.
Currently, the national electricity suppliers need to know at any one time how much electricity the country needs, and exactly match it. If they generate too much or too little electricity, the grid will fail.
The current solution is to keep several coal-fired power stations running at 50% capacity, so they can be throttled up or down at a moment's notice.
It is a very inefficient, wasteful thing to do, but it is the only way to be ready for unexpected surges or lulls in demand.
Now though, clean technology company RLtec thinks it has a solution - the smart fridge.
As commercial manager Joe Warren explained, the smart fridge monitors its internal temperature as well as the status of the national grid - and then compensates accordingly to balance the grid.
"If the grid needs some assistance - if there is too much or too little electricity in the grid - and if the food is at the right temperature, the fridge can turn its motor off or on earlier than it otherwise would have done."
So far the smart fridge is just part of a trial roll-out by energy supplier nPower - balancing the national grid will of course require many, many more people to buy them.
The world may slowly be getting its head around smarter energy and governments are getting on board.
But for at least the next few years - until the devices become mandatory and affordable to all - smarter energy consumption will remain an option only to those who are willing, and able to pay for it.

Gartner sees soft growth in enterprise IT spending

Global enterprise spending on information technology should increase 3.1 percent in 2011, industry tracker Gartner said on 18 Oct 2010, as it predicted "timid" growth over the next five years.

Gartner forecast IT spending to rise to $2.5 trillion in 2011 and climb to $2.8 trillion in 2014, a gain of only 12 percent over that period.
The research group said the next five years would "represent a period of timid and at times lackluster growth."
"Several key vertical industries, such as manufacturing and financial services, will not see IT budgets recover to pre-2008 levels before 2012 or 2013," Peter Sondergaard, head of research at Gartner, said in a statement. "Emerging economies continue to be the locomotive of enterprise IT spending, substantially outpacing developed economies."
Spending is on pace to rise 2.4 percent in 2010 to $2.4 trillion, Gartner said.

Friday, October 15, 2010

The Industry Talk

How important are Social Media platforms? Or is this just a passing fad?
They are big. They are here to stay. Not a passing fad by any chance. Social media enables people to do what they naturally like to do: staying in touch, making new friends, belonging to groups, getting validation, and so on. When something helps you do what you anyway want to do, it's here to stay!

Is there an ROI from the medium? How would one define it?
ROI for sure, if we are talking of “Risk of Inaction”! Then again, what is the ROI of the telephone? Of the accountant? Social media is part of the business ecosystem, and it recovers money at the end of the day, via the various contributions or, alternately, cost savings that it enables. In some cases it can drive transactions as well, and those are the cases where ROI is faster.

What challenges will people/brands/organizations face in accepting this medium?
The biggest challenge is the one faced by all new breakthroughs in technology-related spaces: waiting for others to do it first. Waiting for case studies to happen. Waiting for later. Waiting for references. In short... waiting!
Then, when it gets to the point where competition or a senior in the company pushes one into the space, mistakes are made, because decisions are made hurriedly. This leads to knee-jerk efforts that don't work.
Obsession with the way things used to happen is another challenge. Benchmarking against different types of traditional media is a challenge when these are not comparable.

Thursday, October 14, 2010

Cloud Computing Represents 10 Percent of Spending on External IT Services in 2010

Cloud computing services consumed from external service providers (ESPs) are estimated to be 10.2 percent of the spending on external IT services, according to a worldwide survey by Gartner.
From April through July 2010, Gartner surveyed 1,587 respondents in 40 countries to understand general IT spending trends and spending on key initiatives such as cloud computing. Participants were IT budget management professionals—CIOs, IT VPs, IT directors, and IT managers. Four hundred eighty-four respondents participated in the drill-down on cloud computing and were asked how their organization's current budget for cloud computing was distributed, as well as what their estimate was for spending next year.
"The cloud market is evolving rapidly, with 39 percent of survey respondents worldwide indicating they allocated IT budget to cloud computing as a key initiative for their organization," said Bob Igou, Research Director at Gartner. "One-third of the spending on cloud computing is a continuation from the previous budget year, a further third is incremental spending that is new to the budget, and 14 percent is spending that was diverted from a different budget category in the previous year."
About 46 percent of respondents with budget allocated to cloud computing indicated they planned to increase the use of cloud services from external providers. Gartner analysts said there is a shift toward the utility approach for noncore services, and increased investment in core functionality, often closely aligned with competitive differentiation.
More respondents expected an increase in spending for private cloud implementations that are for internal or restricted use of the enterprise (43 percent) than those that are for external and/or public use (32 percent).
"Overall, these are healthy investment trends for cloud computing. This is yet another trend that indicates a shift in spending from traditional IT assets such as the data center assets and a move toward assets that are accessed in the cloud," said Igou. "The trends are good news for IT services providers that have professional services geared to implementing cloud environments and those that deliver cloud services. It is bad news for technology providers and IT services firms that are not investing and gearing up to deliver these new services seeing an increased demand by buyers."
On a regional basis, Asia-Pacific, Europe, the Middle East and Africa (EMEA), and North America spent between 40 and 50 percent of the cloud budget on cloud services from ESPs. Latin America was the exception, with a notably larger portion of budgets being spent on developing and implementing private and public cloud environments, reflecting the need to cater to the close business relationships and high-touch interactions that are characteristics of the Latin culture.
"Cloud-based IT services are evolving fast and differently in the countries and regions surveyed. Service marketing managers for IT services providers must be monitoring the contract value and intentions of customers for their service lines and cloud service offerings at the country and regional levels of their operations," said Igou. "Demand is shifting from traditional proprietary and highly customized assets to ubiquitous assets that are accessed by customers. Service marketing and service delivery managers need to lead the curve of investment in the skills and capabilities of their service offerings, which means investing before having contracts."

Next budget year to see hike in Global IT spend

Following two years of economic turbulence worldwide, the next budget year will see a hike in global IT spending, with 39 percent of organizations globally expecting budget increases, predicts a survey by Gartner.

Asia Pacific expects a slightly higher rate of increases, at 44 percent of organizations; among those, 72 percent plan for increases of more than 10 percent and 36 percent expect an increase of more than 20 percent over the current year's budget.


The survey, carried out among 1,500 IT leaders in 40 countries, covered key initiatives such as cloud computing, business process improvement, data centre expansion/consolidation, enterprise software implementation/upgrade, outsourcing, and security, risk and compliance, to understand general IT spending trends. According to Derry Finkeldey, principal research analyst at Gartner, Asia Pacific is quickly returning to economic growth, having been among the regions least hit by the global recession.


Enterprise software implementations and upgrades are receiving the greatest investment focus in Asia Pacific, with 85 percent of organizations allocating budget to implementations in the current budget year and most planning to spend at the same or higher levels in 2011. Though respondents in Asia Pacific are currently investing in expansion initiatives, 63 percent of them have not budgeted for any type of cloud service in 2010.


Finkeldey also pointed out how organizations prioritize initiatives, especially cloud computing and data centre initiatives. "While we are seeing growing interest in cloud computing, most organizations are still learning when it comes to cloud services initiatives," said Finkeldey. The IT budget distribution across spending lines is fairly consistent globally, especially for telecommunications services and external IT services, with IT personnel accounting for about 30 per cent. External IT services account for 12.5 per cent of the average IT budget globally. Twenty-three per cent of IT budgets in Asia Pacific are allocated to enterprise software initiatives, with 35 per cent of budgets for new software licenses allocated to horizontal software, says Gartner.

Wednesday, October 13, 2010

IT asset management breakthrough

In an impact analysis issued late last year, Enterprise Management Associates noted that IT asset management "is coming back into prominence after existing for many years as something of a dormant, niche discipline."

The reasons? In the report, EMA cited a variety of them, including the need for more effective cost management and accounting given constrained IT budgets; the rise of IT Infrastructure Library v3, in which asset and service life-cycle management come together and take on a more prominent role; and increased pressure on IT to optimize infrastructure and minimize costs in light of green IT, virtualization and other such initiatives.

EMA suggested that these changes give rise to a new model for IT asset management, one that better melds disciplines such as service, asset and compliance management to boost visibility, automation, risk minimization and operational efficiency. Numara Software's Numara Asset Management Platform (NAMP) is one good example of the new model, EMA said.

NAMP provides a unified architecture for eight integrated products that deliver a range of services, including inventory and discovery, desktop configuration management, software license management and more, EMA noted. Last week, Numara enhanced this IT asset management platform with additional PC and server life-cycle management capabilities.
NAMP version 10 includes three primary enhancements. The first is Service Anywhere, which enables secure, on-demand remote service over the Internet; the intended goal, Numara said, is to deliver better service to a growing mobile workforce. The second is Instant Expert, in-product guided help for IT professionals as they set up, configure and fire up NAMP features. And the third provides tighter controls over software licenses, usage management and compliance with licensing policies.

In addition, NAMP 10 includes new usability features, including direct device access, advanced reporting capabilities, real-time auditing, index searching, alert management and switch port mapping.

Of NAMP 10, Steve Brasen, EMA senior IT industry analyst, said in a published release, "Numara has achieved a milestone in endpoint lifecycle management …. The new release simplifies many of the management challenges inherent in today's complex support stacks -- such as enabling secure remote management of clients over the Internet without the need for a dedicated gateway server -- a feature I am unaware of any other solution providing today."

Friday, October 8, 2010

How will LinkedIn Signal change the way we work?

You may have heard that LinkedIn recently launched a new service called Signal that is designed to make LinkedIn a more socially relevant application.  The service is only available on a very limited basis right now, but will probably roll out to the general LinkedIn population pretty soon.

What is Signal?  It’s a simple way of getting real time LinkedIn status updates and Twitter updates delivered in your LinkedIn interface.
Now, this may not seem like any major technical breakthrough – it isn’t – but it does potentially make LinkedIn a much more relevant application in our daily lives.
If you’re like me, you joined LinkedIn because you wanted a simple way to connect online to a community that is more business-oriented than you’d typically find on Facebook, MySpace, or other social networks.  LinkedIn has been very successful for this purpose, with tens of millions of active users.

LinkedIn, however, has never really become a part of my daily life, like Twitter and Facebook have.  I don’t check in on LinkedIn multiple times each day to see what’s going on with my network.  Contrast that to Twitter and Facebook, where I seem to be spending more and more of my time.  LinkedIn Signal may change that.

By making LinkedIn a more social application, it may suddenly become very relevant in my workday.  It is, after all, the “place” that I keep all of my professional contacts, so it would make sense that I spend time during my workday updating and looking for updates within that network – it just hasn’t really been a feature of LinkedIn to be able to do so.

With the introduction of Signal, LinkedIn collides with my other social applications. It will be interesting to see, six months from now, where I am spending my time online. Will it be spent on LinkedIn? Let's wait and watch.

Wednesday, October 6, 2010

Lack of cloud computing vision is hurting most enterprises

The use of cloud computing within most enterprises and government agencies is rather ad hoc these days. This is understandable, considering the tactical and even experimental nature of most cloud computing deployments, as well as the learning curve that most IT organizations are going through in moving to the cloud.

However, if history is any guide, I suspect that the tactical and ad hoc nature of moving to the cloud will continue to manifest itself as cloud computing continues to mature. Thus, private, public, and hybrid cloud computing services will be used without any kind of overall vision or strategy.

The core issue with a lack of strategy is that cloud computing is about driving a holistic change in IT, but doing so a bit at a time. Thus, there really should be an overarching idea of where the enterprise or agency is heading with cloud computing technology, with tactical solutions then created around that vision or framework.

This issue really comes down to both value and risk: value, which increases when there is a clearly defined objective of finding better and less expensive ways to do computing, and risk, whose cost increases dramatically when there is no plan.

The fact of the matter is that IT is very good at buying technology and tossing it at business problems, but not so good at planning. Indeed, if you want to be the most hated man or woman in IT, try being the person in charge making sure that everything adheres to a plan or budget. You won't get invited out to many lunches, trust me.

However, for all this cloud stuff to work, you really need to focus on the end game, more so than with any other fundamental technology shift. Those that don't will end up with another set of new cloud-based systems that are highly dysfunctional in how they work and play together. In other words, cloud computing won't be able to deliver on the expected business value.

IBM adds Notes capabilities to LotusLive

IBM has upgraded its hosted collaboration suite to include e-mail and social networking

IBM has upgraded its hosted collaboration suite, LotusLive, with additional capabilities, including a revamped e-mail and calendaring service called LotusLive Notes, and a new set of collaboration tools, called Communities.

The newly unveiled LotusLive Notes, based on IBM's on-premise Lotus Notes software, features messaging, calendars, instant messaging and contacts. Users can deploy either a Web edition or thick-client software. Previously, LotusLive had offered Web-based e-mail for $3 per user per month, but the new messaging/calendaring service starts at $5 per user per month. In the new plan, each user is allotted 25 gigabytes of storage.

Support for Notes applications has not been added yet, said Sean Poulley, vice president of IBM's online collaboration services. But the service can be integrated with an enterprise's in-house Lotus Notes deployment, using internal LotusLive capabilities and IBM's Tivoli Directory Integrator.

"No companies want to have two directories. No one wants double sign-on, but everyone has external requirements," Poulley said, noting that this integration would allow organizations to collaborate more easily outside their firewalls. "Most of what you can do in administrating mail, calendaring and scheduling in an on-premise environment, you can also administer LotusLive from the same Domino console."

LotusLive's other major new addition, Communities, might also be handy for cross-organization collaboration. It lets users interact via discussion forums, project-tracking tools, bookmarks, file sharing and information tagging. It closely resembles IBM's enterprise social-networking software Lotus Connections, though it does not include the ability to create blogs and wikis. Those tools may be available at a future date, Poulley said.

In addition to those enhancements, IBM has integrated LotusLive with Tungle's scheduling and calendar applications -- allowing LotusLive users to schedule meetings across different calendaring applications -- and with Bricsys' software for sharing, viewing and annotating documents, with support for more than 70 document types.

The entire LotusLive suite, which also comes with Web conferencing, costs $10 per user per month.

 

Green IT in high demand among IT buyers

Although interest in environmental stewardship has waned, companies continue to eye green IT to cut costs

Demand for green IT products has increased impressively over the past few years, according to new research from Forrester. Back in 2007, only 25 percent of companies had a list of green criteria for technology purchases, whereas 60 percent now have such requirements. Notably, the biggest driver for purchasing green is to cut costs, rather than to be a good environmental steward -- yet many companies are still struggling to find an ROI from embracing green IT.
Although both awareness of and demand for green IT products are high, economic hardships -- and perhaps some skepticism as to the true ROI of green tech -- pose significant barriers to adoption, according to the latest in Forrester's series "The State of Green IT Adoption." "Our survey found a significant jump (from 29 percent to 40 percent) in the number of companies stating that a lack of a clear business case or return on investment (ROI) is a factor in not having a plan in place," the report says. "Many companies remain cautious as the broader economy appears to slowly stagger out of recession, and budgets are likely to remain tight in the short to medium term."

Those companies that are embracing green tech are doing so for varying reasons. The top driver remains doing more with less: 79 percent of respondents said that improving IT efficiency was a high or critical priority.

Forrester also reports a "renewed emphasis on green criteria in 2010," driven partly by increased regulation (whether implemented or expected) and partly by pressure from buyers on their supplier ecosystem to comply with the sourcing standards laid out in their corporate sustainability strategies and goals.

Meanwhile -- likely a reflection of economic hardships -- fewer companies report they're embracing green IT for the sake of environmental stewardship, according to the report: That number has dropped from 50 percent in October 2007 to just 30 percent now.

The most widely adopted green IT initiatives within the data center, according to the report, involve virtualization and consolidation; 68 percent of the respondents said they've already embraced those technologies, while another 22 percent said they plan to do it by 2011. In a similar vein -- that is, freeing up wasted space and IT gear in the data center -- 30 percent of respondents said they've already eliminated redundant applications, and another 35 percent are planning to do so in the next two years.

On the other end of the spectrum, techniques such as embracing modularity or adopting a DC power supply system rank low on organizations' agendas: 13 percent of the surveyed companies said they've gone modular and another 11 percent are planning to. Meanwhile, 12 percent of respondents have a DC system in place, and another 5 percent are planning on doing the same.

Outside the data center, printer consolidation and PC power management are corporations' top green IT projects; 66 percent of companies have already implemented printer consolidation, and 19 percent plan this for either 2010 or 2011. Further, 41 percent of respondents said they're using PC power management, while another 21 percent plan to deploy it over the next couple of years.

Finally, interest in ECEM (enterprise carbon and energy management) systems, which are being offered by an array of companies, from Microsoft and IBM to pure players such as PE International and Hara Software, has risen over the past few months. Last November, 13 percent of respondents said they'd adopted such systems, and the figure is now up to 19 percent. Another 17 percent are planning to embrace ECEM by 2011, and 10 percent aim to adopt such a system after 2011.

 

Monday, October 4, 2010

Google TV, all set to launch

The first devices featuring Google TV, from Sony and Logitech, will be available this month.

Almost five months after telling the world about its television aspirations, Internet search giant Google is providing more details on its forthcoming Google TV service.
The first devices featuring Google TV, from Sony and Logitech, will be available this month, Google said in a blog post on Monday.

Google also listed a variety of media and technology companies whose content and services will be available on Google TV, including HBO, Netflix, Twitter and music video website Vevo.
Google’s efforts to conquer the living room represent another front in its increasing rivalry with Apple, with the two tech companies also competing in smartphones, tablets and mobile advertising.
In September, Apple announced a new, overhauled version of its struggling Apple TV product, which will allow users to rent television shows for 99 cents a pop from News Corp’s Fox and Walt Disney Co’s ABC (It probably doesn’t hurt that Apple CEO Steve Jobs is on Disney’s board of directors).
For its part, Google is pushing its partnership with HBO as one of its marquee premium content offerings.
The HBO service will allow existing HBO subscribers, who receive cable TV service from Comcast or Verizon, to watch up to 600 hours of HBO shows at their leisure. Of course, HBO already offered Verizon and Comcast customers the ability to do this on the web – now those folks can also watch the latest episode of “Boardwalk Empire” or “Bored to Death” on Google TV.
The big question, though, is how much Google TV will actually cost. Perhaps we’ll learn more later this week, when Logitech holds a media conference to unveil its line-up of Google TV devices.

Salesforce.com to expand Japanese business

Salesforce.com has a global customer base of 82,400 companies, including 3,000 Japanese firms, which are serviced from data centers in the U.S. and Singapore

U.S.-based cloud computing firm Salesforce.com is taking steps to broaden its operations in Japan, the Nikkei business daily reported.

Salesforce.com plans to set up its first Japanese data center and buy a stake of more than 5 percent in software developer Synergy Marketing Inc for 100-200 million yen ($1.2-$2.4 million), in a bid to boost its annual sales to 100 billion yen, the paper said.

The company, which currently generates under 10 billion yen a year from its cloud computing services in Japan, has decided to lease one floor of a data center in Tokyo from NTT Communications Corp, where it will install several hundred high-performance computers and begin operations in 2011, the daily said.

Salesforce.com has a global customer base of 82,400 companies, including 3,000 Japanese firms, which are serviced from data centers in the U.S. and Singapore, the daily said.
Synergy Marketing will conduct several rounds of third-party share allocations to increase its capital starting October, the business daily said.

From January, Synergy Marketing will develop and sell new products that integrate features of the software provided by both firms, enabling Salesforce.com to offer cloud-computing services covering everything from customer development to business management, the Nikkei said.

Monetizing Mobile Traffic Takes Mobile Data Intelligence

Mobile operators are facing a revenue dilemma these days. The numbers vary according to which research you read, but the conclusion is the same: data revenues are nowhere near keeping pace with escalating usage.

By way of example, AT&T estimates that mobile data traffic grew 50-fold between 2006 and 2009. Nokia Siemens, for its part, predicts traffic will increase 300-fold over the next five years. Despite these increases, Nokia Siemens projects that operator data revenues will only triple over the same period.

Many are trying to redress this balance before the gap widens beyond control. In an attempt to monetize data traffic usage, AT&T and O2 were among the first to revise their pricing plan structures from the "all you can eat" model and limit how much data subscribers can use.
This may be a sign of future pricing models, but it still falls short of optimizing the monetary value of data usage. In some cases, blanket price increases or restriction policies may alienate existing customers and jeopardize the ability to acquire new ones. In others, operators may underestimate usage, losing revenue opportunities from heavy users, or overestimate it and overcharge lighter ones. To succeed, operators need pricing and promotion plans that are smarter, more tailored and more flexible, meeting a wider range of subscriber needs. Such pricing models will take into account not only the quantity of data consumed but also the types of applications and services used.

A Different Landscape

The challenge for smart pricing lies in operators' inability to track subscriber data usage in any significant detail. Typical business intelligence and analytics tools give operators only limited views of subscriber behavior: a general understanding of the peaks and valleys of data traveling across their networks over specific periods of time or regions. This impedes their capacity to build offers that are relevant and valuable to subscribers while remaining profitable for the operator. For example, most operators would find it a challenge to build an offer that includes unlimited Facebook usage while charging for every other type of data service (e.g. email, browsing, etc.), as the sketch below illustrates.
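To make the idea concrete, here is a minimal Python sketch of what billing under such an application-aware offer might look like once traffic has already been classified by application, which is precisely the step that is hard without better data intelligence. The record format, category names, rates and fees are illustrative assumptions, not any operator's actual plan.

from dataclasses import dataclass

@dataclass
class UsageRecord:
    subscriber_id: str
    app_category: str   # e.g. "social", "email", "browsing"
    megabytes: float

def monthly_charge(records, zero_rated=frozenset({"social"}),
                   rate_per_mb=0.02, base_fee=10.0):
    # Flat base fee, zero-rate the selected categories,
    # and meter everything else per megabyte.
    metered_mb = sum(r.megabytes for r in records
                     if r.app_category not in zero_rated)
    return base_fee + metered_mb * rate_per_mb

usage = [
    UsageRecord("sub-001", "social", 800.0),    # free under this hypothetical plan
    UsageRecord("sub-001", "email", 120.0),     # metered
    UsageRecord("sub-001", "browsing", 300.0),  # metered
]
print(monthly_charge(usage))  # 10.0 + 420 MB x 0.02 = 18.4

The billing rule itself is trivial; the point is that it cannot even be expressed unless per-application usage data is available in the first place.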

There's a reason that "breaking down" data usage is problematic. The mobile market functions differently from others, where CRM programs can pinpoint the buying and usage patterns of specific subscribers and/or demographics through a centralized point of information capture or database.

Mobile operators, however, no longer control their subscriber base the way they used to. In the recent past, operators were able to use a dedicated portal to control and manage the customer experience, thereby enjoying ready access to usage patterns and behaviors. The past few months, however, have seen a veritable explosion of vendor-agnostic applications and app stores. That means fewer customers are using mobile operator portals to access information and services. While portal data remains useful, it is far less representative of user behavior.

A New Intelligence

This new form of mobile data intelligence (MDI) goes beyond the basics to provide actionable metrics that drive smarter pricing and marketing decisions. With the kind of understanding MDI provides, operators can be much more innovative and proactive in developing customized plans that better suit both their subscribers' needs and their bottom line. It enables marketers to take into account not only the quantity of data consumed but also the types of applications and services used by each subscriber segment.
An MDI solution links information across all data services (data volume, usage patterns, device information) to create the next generation of KPIs, including:
  • For each service: # of users, data volume, # of sessions
  • For each service: top devices by # of users, data volume, # of sessions
  • For each service: peak usage times and locations in terms of users, data volume, sessions
  • For each device: top services by # of users, data volume, # of sessions
These KPIs reflect the richness of the mobile data environment -- not just a few services coming from a single provider. In addition, they capture detailed information about any service and application moving across an operator's network, regardless of the source.
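As a rough illustration, the Python sketch below shows how a few of these per-service KPIs might be rolled up from raw usage logs. The log schema assumed here (one row per session, with service name, subscriber, device and megabytes) is for illustration only; a real MDI product would work from network probes and far richer records.

from collections import defaultdict

def per_service_kpis(log_rows):
    # log_rows: iterable of dicts with keys
    # "service", "device", "subscriber", "megabytes" (one row per session).
    kpis = defaultdict(lambda: {"users": set(), "mb": 0.0, "sessions": 0,
                                "devices": defaultdict(int)})
    for row in log_rows:
        k = kpis[row["service"]]
        k["users"].add(row["subscriber"])
        k["mb"] += row["megabytes"]
        k["sessions"] += 1
        k["devices"][row["device"]] += 1
    # Collapse the user sets into counts and pick the top device per service.
    return {
        service: {
            "users": len(k["users"]),
            "data_volume_mb": k["mb"],
            "sessions": k["sessions"],
            "top_device": max(k["devices"], key=k["devices"].get),
        }
        for service, k in kpis.items()
    }

rows = [
    {"service": "facebook", "device": "N97", "subscriber": "a", "megabytes": 5.2},
    {"service": "facebook", "device": "iPhone", "subscriber": "b", "megabytes": 9.1},
    {"service": "email", "device": "N97", "subscriber": "a", "megabytes": 1.3},
]
print(per_service_kpis(rows))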

Digging Deeper With Mobile Data Intelligence

In addition to tracking the fundamental trends mentioned here, marketers can also track difficult-to-obtain trends such as device fleet distribution, including inbound roamers and gray-market devices. They can also use MDI to uncover "hidden segments" (e.g. top devices for social networking, top application downloads per age group, etc.).
This provides the tools needed for "smart pricing": marketers can develop multi-tiered plans that address different users' requirements and monetize the portion of traffic that once went unaccounted for, without compromising customer satisfaction. With the level and accuracy of segmentation information available through MDI, marketers can also fine-tune the marketing mix to improve the effectiveness of acquisition, conversion and retention campaigns.
Let us consider a couple of examples.
  • A North American operator offers an unlimited data plan for a monthly flat fee. It wants to determine whether a recent increase in data usage is driven by feature phones, smartphones or dongles (broadband wireless adapters). By applying MDI, the operator can see consumption profiles for each type of access. It turns out that dongles account for the majority of the increase, and that a small cluster of smartphone users also consumes far more than the average. The operator then uses MDI to determine whether a particular type of application is driving the increase, and finds that P2P traffic accounts for most of the growth in dongle traffic; most non-P2P users consume less than 500 MB per month, while P2P users typically consume at least 5 GB per month. The solution is a two-tier pricing structure for dongle users: a 500 MB plan at US$35 and a 5 GB plan at $60 per month (a minimal tier-assignment sketch follows these examples).
  • An Asian mobile service provider wants to increase its proportion of post-paid customers, who represent a valuable revenue stream. It applies MDI to understand which customers are most likely to convert from pre-paid to post-paid, in order to develop a targeted marketing campaign. The operator discovers that a majority of pre-paid customers who converted to post-paid were heavy users of email, so it gives priority to pre-paid customers with intensive email use in its direct marketing campaign. The result is a take-up rate five times the average.
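For the first example, the resulting tier assignment could be sketched in Python as below. The thresholds and the P2P-share input are assumptions made for illustration; only the 500 MB/$35 and 5 GB/$60 price points come from the example above.

def recommend_dongle_plan(monthly_mb, p2p_share):
    # monthly_mb: average monthly consumption in MB
    # p2p_share: fraction of the subscriber's traffic that is P2P (0 to 1)
    if monthly_mb <= 500 and p2p_share < 0.1:
        return {"plan": "500 MB", "price_usd": 35}
    return {"plan": "5 GB", "price_usd": 60}

print(recommend_dongle_plan(320, 0.02))   # light, non-P2P user -> 500 MB plan
print(recommend_dongle_plan(6200, 0.75))  # heavy P2P dongle user -> 5 GB plan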

Beating the Revenue Gap

Mobile operators face a significant challenge in meeting escalating data demands without compromising service delivery or revenues. As with any business model, detailed insight into consumer behavior is an essential building block for developing marketing campaigns and pricing models that optimize service delivery and customer satisfaction.
While the mobile world has some unique characteristics, the need for mobile data intelligence is becoming increasingly clear. Only through detailed metrics can operators truly realize the revenue potential of data traffic.