
Monday, December 27, 2010

RIM Is Buying Sweden’s The Astonishing Tribe (TAT)

RIM, maker of BlackBerry smartphones, is set to buy Sweden’s The Astonishing Tribe (TAT), which specializes in developing software that lets smartphone users personalize their devices the way they like.
About 15% of smartphones globally use TAT software; even Android users can add three-dimensional icons to their screens by downloading it.
It is not yet known how much RIM will pay for the Swedish software development company.
Confirming the deal, David Yach, chief technology officer at RIM, said in a blog post, “Today we are pleased to confirm plans for The Astonishing Tribe (TAT) team to join Research In Motion (RIM). We’re excited that the TAT team will be joining RIM and bringing their talent to the BlackBerry PlayBook and smartphone platforms.”
He further added, “For those who don’t know, TAT is renowned for their innovative mobile user interface (UI) designs and has a long history of working with mobile and embedded technology.”
Let’s see if this latest acquisition by RIM can help put BlackBerry on a par with the iPhone and Android phones.

Saturday, December 4, 2010

Global cloud computing mkt to be worth $30bn in 4yrs: Gartner

The global market for the 'transformational' cloud computing technology is expected to be worth over $30 billion (around Rs 1.35 lakh crore) in the next four years, according to a senior official at research group Gartner.


Cloud computing refers to technology whereby entities can share resources and software on demand over the internet.


"It (cloud computing) is an extremely attractive technology for entities. The worldwide market for this technology is expected to be over $ 30 billion by 2014," Gartner Vice-President (Research) Milind Govekar told.


Noting that developing markets such as India have an edge in adapting to cloud computing, he said that service delivery (using this technology) would be crucial.


"Companies need to embrace many changes with cloud computing and services would go through huge transformations with this technology," he added.


Govekar noted that cloud computing would be more environment friendly and efficient than many traditional technologies.


In a recent research report, he cautioned that while using cloud computing, enterprises must curb their old habit of over-provisioning infrastructure.


"(This) would result in diminished resource efficiency and environmental benefits, particularly for private cloud environments," he said.

Monday, November 29, 2010

Google Earth 6 Brings Integrated Street View And 3D Trees. Yes, Trees. 80 Million Of Them!

 

There’s an easy way to tell that Google Earth is getting so advanced that it’s getting dangerously close to looking like actual Earth: touted new features are kind of humorous. While version 4 brought the sky, and version 5 brought the oceans, now version 6 is bringing trees. Yes, trees. I fully expect version 7 to highlight the addition of dirt.

Kidding aside, the latest version is obviously the best one yet. And trees are obviously a hugely important part of the Earth. To get them into Google Earth, the search giant has made 3D models of over 50 different species of trees. And they’ve included over 80 million of them in various places around the world including Athens, Berlin, Chicago, New York City, San Francisco, and Tokyo. They’re also working with some conservation organizations to model threatened forests around the world.

The other big addition to this latest version of Google Earth is Integrated Street View. To be clear, Google has had a form of Street View in Google Earth since 2008, but now it’s fully a part of the experience. This means that you can go all the way from space, right down to Street View seamlessly. That’s because Google has included their Street View mascot/button, Pegman, in the main navigation controls now. Just like in Google Maps, you just pick him up and drop him anywhere highlighted in blue, and you’ll be taken to the detailed Street View.
And you can now fully navigate the Earth using Street View in Google Earth. Simply use your keyboard or mouse to move around.
Google Earth 6 also makes it easier to discover and explore historical imagery. This feature was added in version 5, but it wasn’t easy to find. Now you’ll be able to see when it’s available right at the bottom of the screen.

Google Earth 6 would definitely be Treebeard’s favorite version of Google Earth yet. Check out more in the pictures and videos below.

Tuesday, November 23, 2010

Indian IT to become hot spot for M&A in 2-3 yrs: Gartner

The next two to three years could see the Indian IT sector become a hot spot for merger and acquisition (M&A) deals. That's the word from research firm Gartner, which says the 10 deals seen so far this year are just the beginning.
With 10 deals in the pocket so far this year, the Indian IT sector seems to have caught the consolidation bug. And Gartner says that the next two to three years will only see this number increase.
Gartner argues that increased competition in the sector will drive players to take the acquisition route to expansion. And deal sizes may vary widely: from USD 50 million deals to deals that cross the USD 250 million mark.
Gartner adds that the most common scenario will involve an IT player looking to acquire IT vendors with a complete bouquet of service offerings, not just niche services.
Partha Iyengar, VP and distinguished analyst, Gartner, said, “For an English-speaking market and to some extent even a European market, India is becoming the center of gravity of that global delivery story. So they are looking for acquisitions in India of tier 2, tier 3 providers. The second is outward M&A from India, where the service providers, we've had a lull because of the recession, primarily focused on mainland Europe.”
Gartner also sees greater Indian interest from Japanese players. Reports are already doing the rounds that Japanese companies like Fujitsu, NTT and Hitachi are on the prowl for stakes in mid-cap IT firms in India. Gartner says that Japanese IT firms have a limited presence in India so far and are afraid of losing clients who are interested in growing their Indian footprint. This will spur M&A interest from Japan.
Peter Sondergaard, Senior VP - Research, Gartner said, “We believe the Japanese providers will look at acquisitions in areas that are important to them and some of the large ones do need presence in India, so that is one area we will see acquisition.”
Deals will not be restricted to cross-border ones. Consolidation among local tier-2 and tier-3 players is also expected to pick up steam, and may even overshadow partnerships in the space.

Wednesday, November 10, 2010

Microsoft SharePoint: Three Deployment Challenges

Enterprise adoption of SharePoint is rapidly on the rise: A new survey from document management company Global 360 reveals that 90% of the survey's 886 respondents currently use SharePoint, with 8% using SharePoint 2010.

Moreover, 67% of those that use SharePoint spread it out enterprise-wide, indicating that SharePoint is not just for the IT department -- it's for all departments.

The survey also highlights how SharePoint is used at organizations. It commonly starts out as a content repository but then transitions to something more dynamic. Sixty-seven percent of survey respondents have extended SharePoint's use to manage document workflows; 66% use it for portal and web content management; and 56% use it to support business processes.
The idea of using content in SharePoint to improve the business is a major theme of the survey. Of the organizations surveyed, 27% say that over half of the documents stored in SharePoint are used to support mission-critical parts of the business.

But despite widespread adoption as well as improvements in search, workflow and social networking in SharePoint 2010, the SharePoint platform does come with its own set of challenges, according to the survey results.

Out-of-the-Box User Experience Not Great
Only 17.6% of survey respondents feel SharePoint delivers a great out-of-the-box user experience and adequately meets their needs. Conversely, 78% describe SharePoint as somewhere between somewhat adequate and inadequate, saying it requires additional in-house design and development.

When asked what was the biggest challenge with their SharePoint implementations, 21% of survey respondents said, "lack of an intuitive, easy-to-use interface for business users."
And an inadequate user interface usually means trouble, according to the Global 360 report: "Generic user experiences often lead to slower user adoption, lower productivity by users seeking workarounds to applications that do not meet their needs, and higher costs to rollback and customize applications."

Building Business Applications Takes Time and Effort
SharePoint, particularly SharePoint 2010, has made advances in areas such as social media, offline access and better CRM and ERP integration. But according to the Global 360 report, "the gap between what has been delivered and what can be achieved is still dramatic."

How One Company Made SharePoint 2010 More Social

The IT group at tech services company Unisys has been thinking about a social networking platform for two years now.

But some recent factors finally put a plan into action: the arrival of a new CEO two years ago who believed strongly in social networking technology and the arrival of Microsoft's SharePoint 2010 with new social features.

Another motivator for Unisys, which provides various IT services for large corporations and government agencies and has over 25,000 employees worldwide, is that employees and clients have come to expect a "Facebook for the enterprise" as more people use social media outside of work.

"Employees are expecting these social tools in the workplace," says John Knab, director of IT applications at Unisys. "Our senior leadership recognized this, and wanted to apply social tools in a way that could help the business."

Indeed, Facebook-esque features like status updates, microblogs, wikis, community pages, and the ability to tag and share content are spilling into the enterprise, whether through the corporate microblogging site Yammer or through "enterprise 2.0" social software suites from vendors such as SocialText, Jive, Atlassian and NewsGator.

All of these companies' suites stand on their own but they are also compatible with Microsoft's sprawling content management platform, SharePoint.

SharePoint 2010 Better, But Not Social Enough

Microsoft, well aware that nimbler enterprise 2.0 companies are selling social software to enterprises, added more social networking features such as wikis, blogs and tagging into SharePoint 2010, released in May.

These enhancements caught the eye of Unisys, a SharePoint customer for six years, and inspired an early upgrade from SharePoint 2007 to SharePoint 2010 through Microsoft's Rapid Deployment Program, which began in January and wrapped up in June.

Yet although the social enhancements in SharePoint 2010 are an improvement, Unisys felt that SharePoint's MySites -- profile pages that include social networking features -- were not quite Facebookish enough, and called on enterprise 2.0 vendor and Microsoft partner, NewsGator, to fill in the gaps with more dynamic microblogging, tagging and RSS feeds.

A True Microblogging Platform

"When you get SharePoint 2010 out of the box, it does not create real microblogging. It's just a wall that doesn't broadcast out," says Unisys Community Manager Gary Liu.

Help Improve Ubuntu on 'Bug Day'

By helping to triage reported bugs, even nontechnical users can participate in making the open source Linux software better. 

One of the great strengths of open source software is that it is continuously being scrutinized and improved by users and developers around the world.

Ubuntu, for example, has a global community of participants who are constantly working to make the Linux distribution better by contributing to the development, design, debugging, documentation, support and other aspects of work on the free and open source operating system.

Today, there's a global online event planned in which anyone can donate a little bit of their time to improving Ubuntu. It's called Ubuntu Bug Day, and it's a great opportunity for users and fans to get involved and contribute to the operating system--no training or experience required.

Bug Triage
Ubuntu Bug Days are actually regular events in the Ubuntu world, and they typically take place on a dedicated Internet Relay Chat (IRC) channel called #ubuntu-bugs.
To join a Bug Day, you'll need client software designed for IRC; many options for various operating systems are available for free download. The IRC section of Ubuntu's online documentation lists several possibilities, but if you use a recent version of Ubuntu, Empathy is the default.
On average, more than 1,800 Ubuntu bugs get reported every week, and the primary task on Ubuntu Bug Days is to "triage"--or classify--those reports so that they can be addressed as quickly as possible. Much as emergency-room patients get triaged the minute they walk in the door so that they'll get what they need as soon as possible, bug reports are subjected to a similar categorization process.
Individual Bug Days typically focus on triaging a specific category of bug reports, and today it's bugs for which no associated package was listed in the original report. So, those participating in today's Bug Day will be going through bug reports that don't list which software package is affected, and then adding that information. Once that's done, the bug reports can be forwarded on to the appropriate place for fixing.
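Conceptually, the triage pass looks something like the sketch below: a minimal Python illustration with made-up bug reports and a naive keyword-based guesser standing in for a human triager, not Launchpad's actual workflow or API.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class BugReport:
        bug_id: int
        title: str
        package: Optional[str] = None   # None: the reporter did not say which package is affected
        status: str = "New"

    def triage_unpackaged(reports, guess_package):
        """Attach a package to reports that lack one, then mark them Triaged."""
        for report in reports:
            if report.package is None:
                report.package = guess_package(report)
                if report.package is not None:
                    report.status = "Triaged"
        return [r for r in reports if r.status == "Triaged"]

    # Example run with invented reports and a simple keyword lookup.
    keyword_to_package = {"gnome-terminal": "gnome-terminal", "wifi": "network-manager"}

    def guess_package(report):
        return next((pkg for kw, pkg in keyword_to_package.items() if kw in report.title), None)

    reports = [BugReport(1, "gnome-terminal crashes when resized"),
               BugReport(2, "wifi drops after suspend")]
    print(triage_unpackaged(reports, guess_package))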

Tuesday, November 9, 2010

5 Keys for Full Recovery in the Cloud

The cloud is a natural solution for disaster recovery, but careful consideration must be given before entrusting your data to a sky-high backup repository. Can you recover workloads from the cloud? How well does it scale? What's the nature of its billing system? Is its infrastructure secure? And will it offer complete protection?

While cloud computing is a familiar term, its definitions can vary greatly. So when it comes to online backup, the cloud is an important capability that can play a large role in securing and protecting data during a disaster -- an approach I like to refer to as "cloud recovery."
In order to be worthy of this cloud recovery title, a solution should have the following five features, which I have outlined below. 

1. Recover Workloads in the Cloud

There is an old saying in the data protection business that the whole point of backing up is preparing to restore. Having a backup copy of your data is important, but it takes more than a pile of tapes (or an online account) to restore. You might need a replacement server, new storage, and maybe even a new data center, depending on what went wrong.
The traditional solutions to this need are to either keep spare servers in a disaster recovery data center or suffer the downtime while you order and configure new equipment. With a cloud recovery solution, you don't want just your data in the cloud -- you want the ability to actually start up applications and use them, no matter what went wrong in your environment.

2. Unlimited Scalability

If you were buying disaster recovery servers for yourself, you would have to buy one for each of your critical production servers. The whole point of recovering to the cloud is that the provider already has plenty of servers.
The ideal cloud recovery solution won't charge you for those servers up front but is sure to have as much capacity as you need, when you need it. Under this model, your costs are much lower than building it yourself, because you get the benefit of duplicating your environment without the cost.

3. Pay-Per-Use Billing

I love pay-as-you-go business models because they force the vendor to have a good product. Plus, this makes the buying decision much easier -- just sign up for a month or two (or six), and see how it goes.
Removing the up-front price and long-term commitment shifts the risk away from the customer and onto the vendor. The vendor just has to keep the quality up to keep customers loyal.
We also know that data centers are more cost-efficient at larger scale, especially in terms of management effort, and they require constant improvement. In your own data center, you might have some custom configurations, but in the disaster recovery data center, you just need racks, stacks of servers, power and cooling. You are much better off paying a monthly fee to someone who specializes.
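To make the economics concrete, here is a rough back-of-the-envelope comparison in Python. All of the figures (server cost, operations cost, monthly fee) are illustrative assumptions, not vendor pricing.

    def diy_dr_cost(servers, hardware_per_server, annual_ops, years):
        """Capital outlay for duplicate DR servers plus ongoing operations."""
        return servers * hardware_per_server + annual_ops * years

    def cloud_recovery_cost(servers, monthly_fee_per_server, years):
        """Pay-per-use: no up-front purchase, just the subscription."""
        return servers * monthly_fee_per_server * 12 * years

    servers = 20
    for years in (1, 3, 5):
        diy = diy_dr_cost(servers, hardware_per_server=6000, annual_ops=15000, years=years)
        cloud = cloud_recovery_cost(servers, monthly_fee_per_server=75, years=years)
        print(f"{years} yr: DIY ${diy:,} vs pay-per-use ${cloud:,}")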

4. Secure and Reliable Infrastructure

Lots of people like to bash cloud providers for security and reliability, but I think they hold the providers to the wrong standard. Although it is fine, in the abstract, to point out all the places where cloud providers don't achieve perfection in security and reliability, as a customer evaluating a cloud vendor, it seems better to compare them to your own capabilities.
I believe that most of the major cloud providers' infrastructures are more secure and more reliable than those of most private data centers. The point is that security and reliability are hard, but they are easier at scale. Having control over your own data center isn't enough -- you also have to spend the money to buy the necessary equipment, software, and expertise. For most companies, infrastructure is a necessary evil. Companies like Amazon and Rackspace do infrastructure for a living, and they do it at huge scale. Sure, Amazon's outages get reported in the news, but do you think you can outperform them over the next couple of years?

5. Complete Protection

Remember the "preparing to restore" line? For me, it really comes home in this idea of complete protection. If your backup product asks you what you want to protect, I am already suspicious. My vote is, "get it all." I see lots of online products offering 20GB plans, and to me, they look like an accident waiting to happen. I don't want to know which files I need to protect -- I want to click "start" and know that any time I want, I can click "recover", and there won't be any "please insert your original disk" issues.
The places people normally get bitten by this are databases (do you have the right agent?), configuration changes (patched your server, or added a new directory of files?), and weird applications (the one that a consultant set up, and you don't really understand how it works). Complete protection means that all of these things can be protected without requiring an expert in either your own systems or the cloud recovery solution.

Technology Is Only Part of Disaster Recovery Planning

Having the right technology in place is only one part of an effective disaster recovery plan. What's too often overlooked are people and facilities. If a disaster should strike, it's vital that important data can be saved, but complete recovery planning will take into account that people will need to go to their homes or to other offices and facilities to resume normal operations.
 
Every server must be accounted for, protected, backed up and ready to be brought back online if the organization loses the physical site that hosts the production system. The problem is that this approach leaves out two-thirds of the total DR planning that the modern organization must do in order to survive a site disaster. Technology is not a small part of your IT DR planning, but it is only one part.
When approaching DR planning, think of it as a trinity of concerns. First, you have the technology, which most companies plan for in some way. Second, you have people: the employees who use the data systems have to be taken into consideration. Third, you have facilities -- after all, you need someplace to house the technology and personnel.

Technology and People

Technological resiliency is something IT shops deal with daily. They know how the servers are backed up or replicated (most times, both). They know which operations they'll perform during a restoration and/or failover procedure. While testing of DR technology plans still happens woefully infrequently, the planning itself is handled in the majority of businesses these days.
People, on the other hand, are often ignored. Companies plan how all their vital data systems will fail over within minutes to another location, but don't know what to do with the employees who are sitting in the same building as the failed servers. Even if it was something as simple as a bandwidth failure that caused the failover to happen, the users who would normally connect over that same bandwidth are now twiddling their thumbs.

Complete DR planning will take into account that people will need to go to their homes or to other offices and facilities to resume normal operations. They may need VPN connectivity or remote desktop (terminal services) systems in place to allow them to access their applications when their desktops are no longer accessible.

They'll also need some method to keep connected to the DR planners so they can be alerted as to where to go and what to do in order to get back online. How will you send a company-wide email when no one in the company can access the email systems? Smartphones may help, but most organizations don't use smartphones for every employee impacted by the disaster. Telephone lists, websites and other tools can help get the word out and get personnel where they need to be. Just make sure that employees are trained on where to look for information well in advance of any disaster.

Whatcha Gonna Do?

Speaking of getting people where they need to be, ensuring that they have somewhere to be is a critical part of the plan. If your DR plan calls for servers to be brought up in a hosted facility, where will your users sit to access those systems? Do you have other offices, or can you rent space at a temporary facility? Where will your clients come to do business with your company, and how will they find you?

While many businesses have embraced the digital age, there are far more who cannot do all of their business virtually. Temporary office space, call centers, phone lines and fax facilities must be planned for well in advance of a disaster. You might also need to arrange transportation and even temporary lodging if critical employees will need to travel in order to reach this new facility. Keep in mind that a single method of travel is just as much of a single point of failure for your DR plan as a single server is.

As you can see, IT DR planning is a large component of a well-rounded DR plan, but it is an invitation for creating a secondary disaster for your business if it is done alone. A true and complete DR plan will allow for people, facilities and technology (think "People, Places and Things") before you're done. Plan for all three effectively, and you will be able to handle disasters on many different levels without fail.

2011 IT Salaries: How Much Are You Really Worth?

Datamation's annual IT Salary Guide lists salaries for various IT professionals, from software developers to IT systems administrators. Additionally, it lists salary increases for specialty IT skill sets. Curious what you're worth on the open market? Find out what you can expect to earn, should you decide to look for a new job in the new year.

The good news: The 2011 IT Salary Guide demonstrates that IT salary levels are once again headed upward.

The 2011 IT Salary Guide indicates some respectable boosts in average salary. Lead applications developers get a 4.7% increase and software engineers enjoy a 4.1% boost in paycheck. Okay, so IT managers move up a ho-hum 2.5% and project managers rise 2.8%, but Web developers jump 4.6% and System Security Admins levitate a healthy 4% (seems like security is always hot).

The bad news: these rising IT salary numbers are merely compensating for last year's 2 percent to 4 percent decreases. So alas, the net effect is that IT salaries are essentially flat over the last year or so. Yes, times are tough.


Below is salary info for a sys admin position, one of many positions for which data has been gathered and published. To see the other positions, check out the article for complete survey results.

Systems Administrator

2011 average salary range: $53,250 - $83,000.
  • The 2011 salary range is an increase of 3.6 percent over this job's 2010 salary range, which was $51,250 to $80,250.
  • The 2010 salary range was a 2.8 percent decrease under this job's 2009 salary range, which was $52,750 to $82,500.
  • The 2009 salary range is an increase of 3.6 percent over this job's 2008 salary range, which was $51,750 to $78,750.
Add a salary increase for the following skills:
  • Add 6 percent for Basis administration skills (down from 7 percent in 2010)
  • Add 9 percent for Cisco Network administration skills (no change from 2010; down from 12 percent in 2009)
  • Add 8 percent for Linux/Unix administration skills (no change from 2010; down from 10 percent in 2009)
  • Add 8 percent for virtualization skills (up from 7 percent in 2010)
  • Add 4 percent for Windows 2000/Windows 2003/XP/Vista skills (down from 6 percent in 2010; down from 8 percent in 2009)
  • Add 6 percent for Windows 7 skills (down from 7 percent in 2010)
Note: Since these numbers are national averages, they must also be adjusted based on your area of the country; a worked example of applying these adjustments follows the city table below.

Salary levels are approximately 10 percent to 30 percent higher in the North East; about average in the South Atlantic (Florida to Delaware); average to modestly lower in the Midwest, Mountain west, and South; and 5 percent to 30 percent higher on the West coast.
IT salaries in large metropolitan areas are higher than the national average. For example, in the following cities they are:

City / Deviation from national average
Boston, Mass. 30% higher
Stamford, Conn. 31% higher
New York, N.Y. 41% higher
Washington, D.C. 30% higher
Philadelphia, Penn. 15% higher
Atlanta, Ga. 10% higher
Miami, Fla. 10% higher
Chicago, Ill. 23% higher
Dallas/Houston, Texas 3% to 5% higher
Irvine, Calif. 24% higher
Los Angeles, Calif. 24% higher
San Diego, Calif. 14% higher
San Francisco, Calif. 35% higher
San Jose, Calif. 32% higher
Seattle, Wash. 18% higher
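As a worked example of how these adjustments combine, the sketch below applies two skill premiums and a city deviation to the 2011 base range. The base figures, the premiums and the Chicago deviation come from the guide above; the assumption that premiums and the regional deviation simply stack additively is mine, since the guide does not spell out how they compose.

    base_low, base_high = 53_250, 83_000               # 2011 systems administrator range
    skill_premiums = {"virtualization": 0.08, "linux_unix": 0.08}
    city_deviation = 0.23                              # Chicago, Ill.

    adjustment = 1 + sum(skill_premiums.values()) + city_deviation   # additive stacking assumed
    print(f"Adjusted range: ${base_low * adjustment:,.0f} - ${base_high * adjustment:,.0f}")
    # -> roughly $74,018 - $115,370 for a Chicago sys admin with virtualization and Linux/Unix skills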

Monday, November 8, 2010

All-in-One Web Browser and Social Networking Tool

RockMelt is a web browser that gives as much importance to social networking as to surfing the worldwide web. It is touted as having been developed from scratch around how people actually use the internet today.
  

For the past two years, the browser was developed in secret, which has spawned numerous assumptions, especially given one of its major backers, Marc Andreessen, who is recognized as a co-creator of Mosaic, the pioneering web browser.

At this point, RockMelt is allowing selected users a chance to test the browser. Social networking has been integrated into the web browser which utilizes cloud technology to allow users to retrieve favorites from any PC the user wants to use. 

It will be necessary to sign in before RockMelt and its social networking functions can be used. These functions let users easily access their social networking accounts on Facebook or Twitter through the web browser itself, without visiting the individual websites.

A drop-down index provides search results, which the browser pre-loads to enhance page loading speed.
Google powers the search element of the browser, which is derived from Chromium.

How typical users react to RockMelt’s version of the web browser and its integrated social networking tool will be worth watching; a comparable browser launched years ago captured less than one percent of the market. With giants such as Microsoft, Apple, Mozilla and Google taking up most of the market, RockMelt will have its hands full when it enters the browser market.

However, the company, with its innovative web browser and integrated social networking function, also has some strong backers, namely Diane Greene of VMware, Bill Campbell of Apple, and Andreessen's venture capital firm.

Thursday, October 21, 2010

Challenges in Testing Mobile Applications

The way we access the net has changed drastically in the past few years. Today a person can manage his documents using Google Docs, find the shortest route to that important client meeting using Ovi Maps, sift through e-mails on his Yahoo account using his Opera Mini browser, check out what his friends are up to using his Facebook widget, and chat with his friends using Meebo -- without even being near his laptop.

Welcome to the world of Mobile web. Mobile applications today have enabled us to literally have all the information we need at our fingertips. These applications are now integrating with our banks, hospitals and workplaces and are set to become an inseparable part of our lives.
According to Gartner, the worldwide mobile applications market is currently estimated at around $6.2 billion. By 2013, this market is expected to reach around $21.6 billion. It goes without saying that this market has huge untapped potential. For the companies involved in creating and testing mobile-based applications, this is the tipping point.

However, mobile applications need to be carefully developed and tested because, unlike a computer's, the security infrastructure of a mobile device may not be as impenetrable. Recent doubts regarding the security of Citibank's mobile banking application are a case in point. Citibank released its mobile application for the iPhone in March 2009, with which customers could keep track of their accounts. In July 2010, Citibank released an updated version and asked users to update because the earlier version had a security flaw that left critical account information vulnerable to attack.

To develop a deeper understanding of how mobile applications need to be tested, we need to appreciate that mobile devices are a new market altogether, with their own distinct concerns. These concerns arise from a host of unique features that make the mobile device market different from PCs. These features are:

Multiplicity of mobile platforms
Multiplicity of mobile platforms is one of the biggest challenges the industry faces today. Handset manufacturers use vastly different platforms such as Symbian, MeeGo, Bada, Android, J2ME, BlackBerry and Brew. An application needs to be tested on all of them to ensure that it works for everyone. If a company plans to test for only a few operating platforms, it runs the risk of locking out potential customers who might be using other platforms.

Multiplicity of Cell phone Models
Every handset maker offers hundreds of models to its customers. These models have different screen sizes, hardware configurations and image-rendering capabilities. An exhaustive test conducted across these devices is the only way to ensure that an application works on all cell phones. Otherwise, your application could end up like one of those mobile anti-virus apps that take up so much memory that the user cannot do anything else, defeating the very purpose they were installed for: to let the user surf the net safely. However, an exhaustive test can prove to be a very expensive and time-consuming process. The testing process should therefore focus on careful selection of the sample and ensure that the application delivers optimum performance for all desired hardware configurations, as in the sketch below.
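A small Python sketch of that sampling idea: generate the full platform x screen-size x hardware matrix, then keep a representative subset instead of testing every combination. The platform names and tiers here are illustrative placeholders, not a recommended device list.

    from itertools import product

    platforms = ["Android", "Symbian", "BlackBerry", "J2ME"]
    screen_sizes = ["small", "medium", "large"]
    hardware_tiers = ["low-end", "high-end"]

    full_matrix = list(product(platforms, screen_sizes, hardware_tiers))   # 4 x 3 x 2 = 24 combinations

    def representative_sample(matrix):
        """Keep the first configuration seen for each (platform, hardware tier) pair."""
        sample, seen = [], set()
        for platform, screen, tier in matrix:
            if (platform, tier) not in seen:
                seen.add((platform, tier))
                sample.append((platform, screen, tier))
        return sample

    print(len(full_matrix), "combinations reduced to", len(representative_sample(full_matrix)))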

Different Carriers
There are over 400 mobile operators across the world offering GSM, CDMA and some lesser-known local networking standards. Each of these operators, to steer clear of the technicalities, does things in a different way. There are a host of issues involved in enabling an application for use on different carriers. Each network provider has implemented systems that behave slightly differently from the others because they use technology from different vendors. Therefore, an Airtel network is different from a Vodafone one, and each poses slightly different challenges. Also, network providers insert web proxies to dictate the transfer of information on their systems. The 'walled garden' approach, where users are allowed to access only a few pre-decided websites, is an example of data control by the operators. Another technique worth a mention here is transcoding, which scales down fixed web content to ensure it fits mobile phone screens. With different carrier requirements, testing needs to cover all target carriers and ensure that the application functions on all of them.

Location
Location is another major constraint in testing mobile applications. An application should ensure that it does not eat bandwidth or fail to function when signals are low. For example, with A-GPS devices, which use cell-tower triangulation to identify your location, low signals sometimes result in a loss of data -- and suddenly a passenger is left wondering where to turn next. Therefore, these applications also need to be tested for their ability to work with low signals. In fact, to completely test a network, the tester needs to be fully connected to the target network: you need to be in China to test on China Mobile and in India to test on BSNL.
Summed up, these challenges make independent testing of mobile applications a complex and expensive affair. However, herein lies the opportunity for today’s entrepreneurs to devise new ways for addressing these issues. Emerging technologies like cloud computing can go a long way in addressing the challenges of testing mobile applications. Creation of tester communities that can test an application for particular geographies can be the next big idea for this industry.
The key is to constantly innovate and identify new business processes that maximize revenues and cut costs without compromising on service requirements.

Investment in open source software set to rise

The open source software market has reached a turning point, with organizations in the United States, United Kingdom and Ireland now committing to clear strategies and policies for open source software development. 

According to the findings of a recently released survey, more than two-thirds of organizations (69 percent) anticipate increased investment in 2010, with more than a third (38 percent) expecting to migrate mission-critical software to open source in the next twelve months.
The survey of 300 large organizations in both the private and public sector found that half of the respondents (50 percent) are fully committed to open source in their business while almost a third (28 percent) say they are experimenting with open source and keeping an open mind to using it.
Furthermore, two-thirds of all respondents (65 percent) noted that they have a fully documented strategic approach for using open source in their business, while another third (32 percent) are developing a strategic plan. Of the organizations using open source, almost nine out of ten (88 percent) will increase their investment in the software in 2010 compared to 2009.

Through both research and work with clients, we can see an increase in demand for open source based on quality, reliability and speed, not just cost savings. This is a significant change from just two years ago when uptake was driven mainly by cost savings. We can expect to see this trend develop as open source continues to evolve and address even more business critical functions.

When it comes to the benefits of open source, the cost was no longer viewed as the key benefit, with respondents focusing instead on other aspects:
* 76 percent of respondents in the UK and US cited quality as a key benefit of open source
* Two-thirds overall (70 percent) cited improved reliability
* Better security/bug fixing was cited by nearly as many (69 percent) across both countries. 

Cost control with open source
Although cost savings are not the primary driver for open source adoption, half of the respondents (50 percent) do cite open source as contributing to an overall lower total cost of ownership.
When asked about the greatest cost savings in open source, the vast majority of organizations surveyed believe they can be made on software maintenance costs (71 percent), initial software development time (33 percent) and initial development costs (33 percent).
Open source software development on the rise but companies still not so open to sharing
The volume of open source software development is set to rise over the next three years. In 2009, 20 percent of software developments were in open source. This is expected to rise marginally to 23 percent in 2010 and to 27 percent by 2013. 

One notable finding, however, is that less than a third (29 percent) are willing to contribute their own solutions back to the community.

Top 10 strategic technologies for 2011

Gartner, Inc. highlighted the top 10 technologies and trends that will be strategic for most organizations in 2011. The analysts presented their findings during Gartner Symposium/ITxpo, being held through October 21.
Gartner defines a strategic technology as one with the potential for significant impact on the enterprise in the next three years. Factors that denote significant impact include a high potential for disruption to IT or the business, the need for a major dollar investment, or the risk of being late to adopt.
A strategic technology may be an existing technology that has matured and/or become suitable for a wider range of uses. It may also be an emerging technology that offers an opportunity for strategic business advantage for early adopters or with potential for significant market disruption in the next five years.   As such, these technologies impact the organization's long-term plans, programs and initiatives.
“Companies should factor these top 10 technologies in their strategic planning process by asking key questions and making deliberate decisions about them during the next two years,” said David Cearley, vice president and distinguished analyst at Gartner.
“Sometimes the decision will be to do nothing with a particular technology,” said Carl Claunch, vice president and distinguished analyst at Gartner. “In other cases, it will be to continue investing in the technology at the current rate. In still other cases, the decision may be to test or more aggressively deploy the technology.”

The top 10 strategic technologies for 2011 include:

Cloud Computing.
Cloud computing services exist along a spectrum from open public to closed private. The next three years will see the delivery of a range of cloud service approaches that fall between these two extremes. Vendors will offer packaged private cloud implementations that deliver the vendor's public cloud service technologies (software and/or hardware) and methodologies (i.e., best practices to build and run the service) in a form that can be implemented inside the consumer's enterprise. Many will also offer management services to remotely manage the cloud service implementation. Gartner expects large enterprises to have a dynamic sourcing team in place by 2012 that is responsible for ongoing cloudsourcing decisions and management.

Mobile Applications and Media Tablets.
Gartner estimates that by the end of 2010, 1.2 billion people will carry handsets capable of rich mobile commerce, providing an ideal environment for the convergence of mobility and the Web. Mobile devices are becoming computers in their own right, with an astounding amount of processing ability and bandwidth. There are already hundreds of thousands of applications for platforms like the Apple iPhone, in spite of the limited market (only the one platform) and the need for unique coding.
The quality of the experience of applications on these devices, which can apply location, motion and other context in their behavior, is leading customers to interact with companies preferentially through mobile devices. This has led to a race to push out applications as a competitive tool to improve relationships and gain advantage over competitors whose interfaces are purely browser-based.

Social Communications and Collaboration. 
Social media can be divided into: (1) Social networking: social profile management products such as MySpace, Facebook, LinkedIn and Friendster, as well as social networking analysis (SNA) technologies that employ algorithms to understand and utilize human relationships for the discovery of people and expertise. (2) Social collaboration: technologies such as wikis, blogs, instant messaging, collaborative office, and crowdsourcing. (3) Social publishing: technologies that assist communities in pooling individual content into a usable and community-accessible content repository, such as YouTube and Flickr. (4) Social feedback: gaining feedback and opinion from the community on specific items, as witnessed on YouTube, Flickr, Digg, Del.icio.us and Amazon. Gartner predicts that by 2016, social technologies will be integrated with most business applications. Companies should bring together their social CRM, internal communications and collaboration, and public social site initiatives into a coordinated strategy.

Video. 
Video is not a new media form, but its use as a standard media type used in non-media companies is expanding rapidly. Technology trends in digital photography, consumer electronics, the web, social software, unified communications, digital and Internet-based television and mobile computing are all reaching critical tipping points that bring video into the mainstream. Over the next three years Gartner believes that video will become a commonplace content type and interaction model for most users, and by 2013, more than 25 percent of the content that workers see in a day will be dominated by pictures, video or audio.

Next Generation Analytics.
The increasing compute capabilities of computers, including mobile devices, along with improving connectivity, are enabling a shift in how businesses support operational decisions. It is becoming possible to run simulations or models to predict future outcomes, rather than simply providing backward-looking data about past interactions, and to do these predictions in real time to support each individual business action. While this may require significant changes to existing operational and business intelligence infrastructure, the potential exists to unlock significant improvements in business results and other success rates.

Social Analytics.
Social analytics describes the process of measuring, analyzing and interpreting the results of interactions and associations among people, topics and ideas. These interactions may occur on social software applications used in the workplace, in internally or externally facing communities, or on the social web. Social analytics is an umbrella term that includes a number of specialized analysis techniques such as social filtering, social-network analysis, sentiment analysis and social-media analytics. Social network analysis tools are useful for examining social structure and interdependencies as well as the work patterns of individuals, groups or organizations. Social network analysis involves collecting data from multiple sources, identifying relationships, and evaluating the impact, quality or effectiveness of a relationship.

Context-Aware Computing.
Context-aware computing centers on the concept of using information about an end user's or object's environment, activities, connections and preferences to improve the quality of interaction with that end user. The end user may be a customer, business partner or employee. A contextually aware system anticipates the user's needs and proactively serves up the most appropriate and customized content, product or service. Gartner predicts that by 2013, more than half of Fortune 500 companies will have context-aware computing initiatives and by 2016, one-third of worldwide mobile consumer marketing will be context-awareness-based.

Storage Class Memory.
Gartner sees huge use of flash memory in consumer devices, entertainment equipment and other embedded IT systems. It also offers a new layer of the storage hierarchy in servers and client computers that has key advantages — space, heat, performance and ruggedness among them. Unlike RAM, the main memory in servers and PCs, flash memory is persistent even when power is removed. In that way, it looks more like disk drives, where information is placed and must survive power-downs and reboots. Given the cost premium, simply building solid-state disk drives from flash would tie up that valuable space on all the data in a file or entire volume; a new, explicitly addressed layer that is not part of the file system instead permits targeted placement of only the high-leverage items of information that need the mix of performance and persistence flash memory provides.

Ubiquitous Computing. 
The work of Mark Weiser and other researchers at Xerox's PARC paints a picture of the coming third wave of computing where computers are invisibly embedded into the world. As computers proliferate and as everyday objects are given the ability to communicate with RFID tags and their successors, networks will approach and surpass the scale that can be managed in traditional centralized ways. This leads to the important trend of imbuing computing systems into operational technology, whether done as calming technology or explicitly managed and integrated with IT. In addition, it gives us important guidance on what to expect with proliferating personal devices, the effect of consumerization on IT decisions, and the necessary capabilities that will be driven by the pressure of rapid inflation in the number of computers for each person.

Fabric-Based Infrastructure and Computers. 
A fabric-based computer is a modular form of computing where a system can be aggregated from separate building-block modules connected over a fabric or switched backplane. In its basic form, a fabric-based computer comprises a separate processor, memory, I/O, and offload modules (GPU, NPU, etc.) that are connected to a switched interconnect and, importantly, the software required to configure and manage the resulting system(s). The fabric-based infrastructure (FBI) model abstracts physical resources — processor cores, network bandwidth and links and storage — into pools of resources that are managed by the Fabric Resource Pool Manager (FRPM), software functionality. The FRPM in turn is driven by the Real Time Infrastructure (RTI) Service Governor software component. An FBI can be supplied by a single vendor or by a group of vendors working closely together, or by an integrator — internal or external.

Farmers Urged To Embrace Technology

 


African farmers have been urged to use technology to boost food production on the continent. Low levels of agricultural productivity over the years have contributed to the recurrent food shortage that affects over 30 per cent of the population, representing about 260 million people.

This problem can be addressed if farmers equip themselves with up-to-date agronomic knowledge, embrace technologies that promote adequate fertilizer application, and use improved seeds that guarantee high yields.

There are plans to boost food production in tropical Africa, and Pedro Sanchez, Director of Tropical Agriculture and Rural Environment at Columbia University, New York, believes that it is possible to save many hungry people in Africa.

Interacting with visiting international journalists at a seminar on ‘Development and poverty reduction’ organized by the German capacity-building institution InWEnt and the Initiative for Policy Dialogue at Columbia University, New York, Mr Sanchez, who is also the Director of the Millennium Villages Project at the university’s Earth Institute, said research in many African countries had shown that the soil lacks nitrogen, which could be replenished through prudent use of fertilizers.

As a solution, agronomists and soil scientists are collaborating with Google to use the latest technology for a digital soil mapping that would help farmers around the world.

According to the Director of the Africa Soil Information Service (AfSIS), the project is developing a digital soil map of the world that would allow farmers, through the use of mobile phones, to test their soil and ascertain the level of nitrogen their land requires.

“Just about everywhere in Africa there are mobile phones, and through the soil mapping project, farmers can send questions on how much fertilizer they can apply to the soil and get answers through SMS.”

He said the lack of a subsidy program, farmers' poor access to farm inputs, and poor extension services can be named as the underlying factors behind the failure of the continent, which has the potential to produce enough food.

“It is essential that the extension services of the agricultural sector are equipped with the needed logistics.” The senior research scholar also dismissed claims that organic farming in Africa was not feasible because of its depleted soil.

He noted that it is possible to promote the high yielding hybrid seeds in Africa, explaining that the promotion of genetically modified organisms (GMOs) poses a new threat.

In most parts of Africa, civil society organisations and other non-governmental organizations are against the introduction of GMOs, claiming that they pose a threat to human lives and the environment.

“There is no scientific evidence of any damage to human health or the environment; for instance, some GMOs like Bt cotton, grown largely in South Africa, have a positive environmental effect as they reduce pesticide application,” he stated.

Two different research bodies in Europe have indicated that there are only simple agronomic problems related to GMOs. “It is not a scientific argument any more but a political argument. Burkina Faso had the political courage and is now a major producer of cotton in West Africa, and they are doing well.”

Cartoonist maps world of social networking


A cartoonist has created a map which imagines websites as countries and shows their size based on how popular they are.
Randall Munroe, a university graduate from Massachusetts, US, has used his imagined world map to represent the levels of social activity in online communities such as Facebook, Twitter and Skype.
Munroe is best known as the creator of the cult webcomic xkcd.
The land mass of each mythical country named after a website equates to the popularity of that site, showing effectively how social activity is spread throughout the internet.
Munroe based his Map of Online Communities on statistical information, including website hits and the number of members each community had over the summer of 2010, reports the Telegraph. The new map is an updated version of one Munroe created in 2007.
Email dominates the map, and unsurprisingly Facebook features prominently, as do Twitter and Skype.
The map does throw up some surprises. Myspace, once one of the most popular social networking sites, is barely visible, only slightly larger than LinkedIn, a site which aims to connect people through business profiles.
Farmville and Happy Farm sit prominently while YouTube, the video-sharing website, is a good-sized island.
The most surprising inclusion for many is QQ, a Chinese instant messaging service which has more than 100 million users but is virtually unheard of in the west.
Munroe said: "This updated map uses size to represent total social activity in a community - that is, how much talking, playing, sharing or other socialising happens there. This meant some comparing of apples and oranges, but I did my best to be consistent."

Wednesday, October 20, 2010

Why Thought Leadership Rules the B2B World

Top B2B marketers understand two key objectives: 1) building new business and 2) becoming a thought leader. Fifty-six percent of B2B marketing executives named “positioning our company as a thought leader” as their top objective in a recent Economist Intelligence Unit study. In addition, a third of respondents felt thought leadership would be the best way for providers to market services in the next 3-5 years.
But why is thought leadership dominating the B2B marketing world? In a world of generating leads and CRM, does the answer to successful B2B marketing lie within the broader control of your company in addition to the collaboration of your marketing and sales forces? Here are some things to consider.
  • You are no longer the hunter, you are now being hunted –The web has enabled top executives to gather their own information via references, social networks and search engines, as opposed to tapping the shoulder of the corporate librarian. A Forbes and Google survey shows 64% of senior executives are clicking “search” more than six times a day seeking business information. It wasn’t so long ago when B2B marketers were doing the hunting. But now, thanks to the web, they are instead being hunted. This puts huge importance on being a thought leader. Compelling, thought leadership information that resonates with a niche will increase your digital marketing KPIs ahead of competitors.
  • The thought leadership & social media marriage – B2B social media marketing builds brands. It creates momentum for awareness, loyalty and equity while strengthening lead generation efforts. Members of social networks (followers, fans, etc.) are always on the prowl for good, credible information. With 69% of B2B buyers using social networks, the opportunity to share relevant content is evident. B2B buyers and marketers are turning to social media to gather their resources, and while there, looking for industry thought leaders to share information. It’s a transitive property: If social media is dominating B2B marketing, then thought leadership is dominating B2B marketing.
  • Customers want to know “how” – Prior to the widespread adoption of social media, public company information mainly consisted of the “what” a company did, and not “how” the company did it. Today, however, customers want to attach a personality to a company and, more importantly, they want to know how companies differ. Thought leadership enables customers to separate companies into the who’s who of the industry. They’re better able to understand a company’s personality and ultimately understand how processes and strategies work.
Thought leadership is continuing to gain momentum and is expected to be a huge driving force of B2B marketing strategies. Understanding thought leadership is the first step in the process. The next step consists of understanding how to effectively and successfully become the thought leader of your industry.

Tuesday, October 19, 2010

India to spend $2.3 trillion to go green

India will spend $2.3 trillion to boost its energy sector by 2030, a figure that includes a substantial outlay for expanding the country's energy basket to include green sources such as solar, wind and nuclear power.

It intends to improve energy efficiency and use clean technology to help Asia's third-largest economy balance growth and environmental aims. B.K. Chaturvedi, a member of the Planning Commission that charts India's growth path, recently said that, being the world's third-worst carbon emitter, it is essential for the country to shift to a greener economy.
   
Last year the country set a goal for slowing the growth of its emissions, saying it will try to rein in its 'carbon intensity' -- the amount of carbon dioxide emitted per unit of GDP -- by between 20 and 25 percent by 2020, from 2005 levels.

Power remains a top priority in terms of increasing energy efficiency and the use of renewables as well. India aims to add about 100 gigawatts (GW) of power generation capacity by early 2017, much of it from coal, despite conceding it would miss by 79 percent an earlier five-year target of adding 78.7 GW by March 2012. But India will need to keep burning cheaper fossil fuel to expand the reach of electricity to the half of its one-billion-plus population that is without power.


"We should develop this in the context of a two-pronged strategy: The first is improving energy efficiency, and the second is changing the mix of the energy which we consume. Some of it will be towards energy consumption, but a lot of it will go towards improving energy efficiency and improving the composition of energy," he added. The portion of the amount would be spent on making the shift to a green economy in the next two decades is so far not mentioned.

Smart Gadgets Go Green

Saving the planet one gadget at a time

[Image: AlertMe iPhone app -- new iPhone apps connect with smart meters to tell you your energy use]
Saving energy in a home full of gadgets can seem impossible but the latest home energy technology can help you save the planet - and money.

Gadgets such as smart meters are part of the government's plan to cut carbon emissions. Due in your home sometime after 2013, they show you exactly what electricity you are using, and how much it is costing.
They will also communicate with your supplier, automatically taking readings and making switching suppliers much simpler.
But you do not have to wait until the end of the decade to start cutting your consumption.
The Wattson is a simplified smart meter. A sensor clipped onto your fusebox monitors the electromagnetic field in the wires coming from it, and the accompanying transmitter sends the information, wirelessly, to the shiny Wattson box.
It shows how much power you are using right now, and how much that would cost if - rather unrealistically - you continued to keep everything on all day, every day of the year.
It really does show how switching on the kettle, the oven and any other home appliance sends your wattage sky high.
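The calculation a Wattson-style display performs is simple enough to sketch in Python: take the instantaneous power draw and project it as if that load ran around the clock for a year. The tariff below is an assumed example figure, not a quoted rate.

    def annual_cost(watts_now, pence_per_kwh=12.0):
        """Cost in pounds if this instantaneous load ran 24 hours a day for a year."""
        kwh_per_year = watts_now / 1000 * 24 * 365
        return kwh_per_year * pence_per_kwh / 100

    for load in (300, 2300):   # e.g. background load vs kettle and oven switched on
        print(f"{load} W all year ≈ £{annual_cost(load):,.0f}")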
The accompanying software, Holmes, is more useful. It can show usage over time, and settings can be tweaked for a particular energy tariff.
A similar system is the AlertMe Energy Home Hub, which also uses a plug and transmitter.
It connects to the internet using your router, and gives you access to your home's energy statistics via an online dashboard, a handheld controller or iPhone app.
You can get a summary on your iGoogle page by way of the Powermeter widget and the hub also talks to smart plugs, which can be switched on or off remotely. 

Magic meter

Smart meters might keep you informed but actually saving the energy - and money - is still down to you.
"Smart meters are not magic. They are a tool and they are like every other tool - you have got to use it and use it intelligently," said Sarah Darby, from the Environmental Change Institute at Oxford University.
"You can take a smart meter round the house, switching stuff on and off, to see what difference it makes to how much [energy] you are using.
"That way, you can see which [devices] are your big users and which are your small users and before you go to bed at night, you can see what is still switched on."
But if smart metering seems like too much effort, one option might be the PassivEnergy management system, which aims to cut your bills with hardly any human intervention.
The system takes over your central heating and hot water controls, using in-room thermometers, a wireless hub, and a new central heating controller to manage your energy use more efficiently.
[Image: PassivEnergy controller -- PassivEnergy takes control of your central heating and hot water system]
Initially, you use the handheld controller to tell the system about your normal routine - how warm you like it, when you are usually in and out, when you go to bed and when you are planning to go on holiday.
Then it starts to watch what you do and learn your habits.
Director of market intelligence for PassivSystems, Wayne Muncaster, says this is the clever part of the system.
"You can tell it when you are in, when you are out, and the hot water you believe you need.
"But what the system will learn over time is what your habits are and what you actually do - how many showers you take in the morning, whether you have a bath in the middle of the day. The system will begin to understand your lifestyle habits.
"That means the system will only fire, and actually burn energy when you actually need it, not when you tell it you need it or when you think you may need it, but when you actually begin to use the system."
If you come home unusually early, you can override your settings by pressing the "occupancy" buttons to flip the system from "out" to "in".
And if you decide not to come home at all, you can remotely control the system using an iPhone app.
It is not cheap - the complete kit costs nearly £600, with an annual fee of £20 or £30, although the company says customers can expect to save that amount on their bills in two or three years. 

Smart fridge

Using technology to save energy in your own home can save you money today, but it could also change the way we power the country tomorrow.
Currently, the national electricity suppliers need to know at any one time how much electricity the country needs, and exactly match it. If they generate too much or too little electricity, the grid will fail.
[Image: Electricity pylons -- smart fridges adapt their electricity use to fluctuations in the national grid]
The current solution is to keep several coal-fired power stations running at 50% capacity, so they can be throttled up or down at a moment's notice.
It is a very inefficient, wasteful thing to do, but it is the only way to be ready for unexpected surges or lulls in demand.
Now though, clean technology company RLtec thinks it has a solution - the smart fridge.
As commercial manager Joe Warren explained, the smart fridge monitors its internal temperature as well as the status of the national grid - and then compensates accordingly to balance the grid.
"If the grid needs some assistance - if there is too much or too little electricity in the grid - and if the food is at the right temperature, the fridge can turn its motor off or on earlier than it otherwise would have done."
So far the smart fridge is just part of a trial roll-out by energy supplier nPower - balancing the national grid will of course require many, many more people to buy them.
The world may slowly be getting its head around smarter energy and governments are getting on board.
But for at least the next few years -- until the devices become mandatory and affordable to all -- smarter energy consumption will remain an option only for those who are willing, and able, to pay for it.