Migrating Your Data and Applications to the Cloud

Cloud computing reduces the expenses of IT organizations by lowering capital expenditure: organizations purchase only the computing and storage resources they actually need. Today, because of these advantages, many organizations are exploring how the cloud could be leveraged to make their enterprise applications available on an on-demand basis.

In the last few years, thousands of companies have moved to the cloud through public, private, or hybrid offerings, and many more are considering the move.

Before you move to the cloud, it is important to look at the major advantages of cloud computing.

When it comes to migration to the cloud, you can take advantage of platforms such as Microsoft’s Windows Azure, Google Cloud, Amazon AWS, and Citrix. Companies use these platforms to build websites, web apps, mobile apps, media solutions, and more.

By migrating to the cloud, you can potentially create more business profit by taking risks and encouraging experimentation. While risk taking in the past required you to invest heavily in hardware and software, the cloud allows you to build an application on a completely scalable platform and release it as a service rather than selling licenses.

Despite these advantages, migration may not be an easy task. Enterprise applications, for instance, must meet strict requirements for performance, service uptime, and more. Migrating them to the cloud requires you to analyze all of these requirements closely and come up with an in-depth migration plan that maximizes ROI.

Cloud migration can greatly reduce the hardware resources required. Since pooled resources are better utilized, moving to the public cloud can dramatically decrease the need for in-house servers, which also reduces physical floor space and power consumption. In addition, as mentioned above, migration will reduce operational and management costs. A number of solution and service providers in the cloud market can help you migrate easily at a reduced cost. MSys has been providing this type of migration service for years.

Things to Check

An important thing to consider while migrating to the cloud is analyzing the changes required in the architecture of the application being migrated. In many cases, the application must undergo a complete architecture change to be fit for the cloud. A service-oriented application works well with the abstraction of cloud services through application programming interfaces (APIs).

Additionally, you should determine whether the application needs to be altered to take advantage of native cloud features. Direct access to elastic storage, management interfaces, and auto-provisioning services are some of the cloud features you may want to take advantage of.

Migration Roadmap

During the transition, you should ensure that the level of service provided in the cloud is comparable to or better than the service provided by traditional technical environments. Falling short of this requirement is usually the result of an improper migration, and it will lead to higher costs, loss of business, and so on, eliminating any benefits the cloud could provide. The steps involved in migrating an application to the cloud include:


1. Assessing Your Applications and Workloads

This step allows organizations to find out which data and applications can be readily moved to the cloud. During this phase, you can also determine the delivery models supported by each application. It makes sense to prioritize the applications to be ported by risk, starting with those holding a minimal amount of customer data or other sensitive information.

2. Build a Business Case

Building a business case requires you to come up with a proper migration strategy for porting your applications and data to the cloud. This strategy should incorporate ways to reduce costs, demonstrate advantages, and deliver meaningful business value. Value propositions of cloud migration include the shift from capital to operational expenditures, cost savings, faster deployment, and elasticity.

3. Develop a Technical Approach

There are two potential service models for migrating an existing application: Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). For a PaaS migration, the application itself has to be designed for the runtimes available in the target PaaS. For IaaS, the requirements are less demanding, since you control the operating environment.

4. Adopt a Flexible Integration Model

An application being migrated to the cloud may have existing connections with other applications, services, and data. It is important to understand the impact of these connections before migrating. The integration model to be adopted may involve three types: process integration (an application invokes another application to execute a workflow), data integration (integration of the data shared among applications), and presentation integration (multiple applications sharing results over a single dashboard).
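To make the three integration styles concrete, here is a minimal Python sketch. The application names and record shapes are hypothetical, invented purely for illustration; in a real migration these would be separate services communicating over APIs rather than in-process functions.

```python
def billing_app(order):
    """A second application that the order app can invoke."""
    return {"order_id": order["id"], "charged": order["total"]}

def order_app(order):
    # Process integration: one application invokes another
    # to execute a single workflow (place order -> charge it).
    return billing_app(order)

def sync_customer_records(crm_records, erp_records):
    # Data integration: reconcile data shared among applications,
    # here keyed by a common "email" field.
    merged = {r["email"]: r for r in crm_records}
    for r in erp_records:
        merged.setdefault(r["email"], r)
    return list(merged.values())

def dashboard(*app_results):
    # Presentation integration: multiple applications share
    # their results over a single dashboard view.
    return {name: result for name, result in app_results}
```

Understanding which of these patterns each connection uses tells you what must keep working (workflow calls, shared data, or combined views) after the move.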

5. Address Security and Privacy Requirements

Two of the most important issues in cloud migration are security and privacy. Security should be especially tight for applications that deal with sensitive information, such as credit card numbers and social security numbers. Several issues must be addressed, including making it difficult for an intruder to steal data, prompt notification of security breaches, the trustworthiness of the cloud service provider’s personnel, and authorization.

6. Manage the Migration

After thoroughly analyzing the various benefits and issues associated with migration, you can plan and execute it. This should be done in a controlled manner with the help of a formal migration plan that tracks durations, resources, costs, and risks.

Our Expertise

In cloud migration, we have industry-wide expertise in SaaS, PaaS, and IaaS. In IaaS, we have worked on private and public clouds with infrastructures such as OpenStack, Amazon AWS, Windows Azure, Rackspace, VMware, Cloupia, and HP Cloud.

Clogeny, MSys’s subsidiary company, has worked on hybrid cloud migration projects for leading clients in server imaging and datacenter provisioning. We have helped add support for several public vCloud Director implementations, including Bluelock, AT&T, Savvis, and Dell. In addition, we have architected a hybrid cloud migration appliance for VMware vSphere. In enterprise Java PaaS, we have worked on VMware vCloud Director, AWS, HP Cloud, and Rackspace.

Conclusion

Cloud computing provides a few key benefits for companies. Migration to the cloud may create a better, modern business model for most tech companies.

Q&A on Mobitaz Android Test Automation Webinar

On June 5, 2014, we conducted a webinar on Mobitaz, MSys’s Android test automation tool. A number of testing and quality assurance professionals from various companies participated. Needless to say, it was a big success.

It can be challenging for a quality analyst to choose the right mobile functional testing tool. Manual testing, or the use of automation tools with limited capabilities, can hinder the QA process.

The Mobitaz team at MSys explored the need for a device- and OS-agnostic mobile testing tool that assures a QA team that testing isn’t compromised and that the QA cycle doesn’t take too long. The discussions in the Mobitaz webinar point toward a major change in traditional automation techniques and solutions.

Some of the interesting discussions initiated by the quality assurance personnel from leading companies are as follows:

Parallel execution

Question: Does it mean that I can run a test made for Kit Kat on Gingerbread and ICS at the same time?

Answer: Yes. Concurrent playback on Kit Kat, Gingerbread and Ice Cream Sandwich can be achieved through Mobitaz.

Question: So, Mobitaz adapts to Android objects that differ between OS versions? Like the progress bar between Gingerbread and Kit Kat?

Answer: Yes. This is something unique about Mobitaz. The tool can record a test case once and play it back across any Android device or OS version. Mobitaz has the intelligence to recognize objects with different Android versions. Through this capability, it can make successful parallel test executions.

Advantages

Q: What are the advantages of this Android test automation tool over other mobile automation tools in the market?

A: We compare Mobitaz directly with other tools which offer a lab-based solution. A few of the advantages over other mobile testing tools are:
• Support for:
    o Android custom components
    o Android WebView components
• Parallel execution
• Testing on real devices without rooting
• Detailed reporting with easy option to export and share to PDF format
• Key measurements of resources such as battery, CPU, memory etc.
• Mobile functional testing for Android versions from Gingerbread to the latest version
• Simplified licensing model
• Cost-effectiveness

Script-less Testing

Q: Does Mobitaz Android test automation tool require any scripting knowledge to create, execute, and generate reports?

A: No. Mobitaz is a script-less test automation tool and does not require any programming knowledge for functional testing of mobile apps. Mobitaz has the intelligence to manage test cases through features such as the Object Repository, Test Case Editor, and Reports.

Eight Tips to Be More Effective in Agile Software Testing

Agile software development happens fast, and code releases happen frequently. Testing in such an environment is essential for producing accurate code that works. How does a programmer ensure the quality of the code? In an agile environment, there are three major challenges:

  • Gathering the requirements and the number of hours committed
  • Creating short-term releases
  • Keeping scrums short to leave more time for code inspections

As an agile software tester, you should be very proficient with the tools you use. Here are eight tips to be more effective in agile software testing.

1. Character Traits of an Agile Tester

A few character traits and mindsets make a successful agile tester. Being passionate, creative, and unafraid is important. The agile tester should also have soft skills in management, communication, and leadership. These skills will help you envision the client’s expectations before the product is delivered.

2. Understanding the Data Flow

When you know how the data travels inside your application, you are better able to analyze the impact of component failures and security issues. Hence, recognize how the data is used within the application early on in order to report bugs and defects faster.

3. Analyzing the Logs

In agile development, understanding the defect that causes an issue in the application under test involves log analysis. Application logs contain a great deal of information about the system-level architecture of the application. Some of the errors that the tester needs to know about are called “silent errors,” which means the end user doesn’t perceive the effect of the error. Log analysis helps you better spot silent errors as well as work more efficiently with the development team.
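As a sketch of the idea, the snippet below scans log lines for WARN and ERROR entries that never surfaced to the end user. The log format and messages here are invented for illustration; real formats vary by application.

```python
import re

# Hypothetical application log; note the user saw no failure,
# yet two silent errors are buried here.
LOG = """\
2014-06-05 10:01:22 INFO  payment processed order=123
2014-06-05 10:01:23 WARN  retry succeeded after 2 attempts service=inventory
2014-06-05 10:01:24 ERROR cache write failed key=session:9 (user unaffected)
2014-06-05 10:01:25 INFO  page rendered in 120ms
"""

def find_silent_errors(log_text):
    """Collect WARN/ERROR lines: failures the end user never perceives."""
    pattern = re.compile(r"^\S+ \S+ (WARN|ERROR)\s+(.*)$")
    hits = []
    for line in log_text.splitlines():
        m = pattern.match(line)
        if m:
            hits.append((m.group(1), m.group(2)))
    return hits
```

Even this simple filter surfaces the cache failure and the retry, giving the tester something concrete to raise with the development team.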

4. Risk- and Change-Based Testing

In agile development, development happens on the fly, and so does testing. Go-to-market time is all that matters, and the teams work together to minimize it. When the application is modified, you, the tester, need to understand which parts of the application are being changed, and the overall effect of the change on the final application.

5. Understand the Business Objectives

The agile tester is essentially the end user of the product. Hence, you should know how end users use the product. In evaluating your testing strategies, focus on the key areas of the application that an end user is most likely to use. Create separate strategies for product architecture and end users. This end-user-specific categorization also allows you to report bugs based on the application’s business objectives, i.e., to prioritize defects. At the end of the day, meeting end-user requirements is what any business needs. Based on the user stories, QA teams prepare the acceptance criteria.

6. Browser Tools

Browser plugins and tools can be highly effective for agile testers. For instance, Google Chrome and Firefox come with built-in developer tools that let testers spot errors immediately. There are also third-party browser plugins, such as Firebug, that testers can use.

7. Requirement Repositories

Understand what type of agile development strategy your organization uses: Adaptive Software Development (ASD), Agile Unified Process (AUP), Kanban, Scrum, etc. Documentation of the test cases and scenarios that the development and testing teams create together is very important. Over time, the requirements and test scenarios are gathered into a repository-style system from which a tester can get a lot of information.

8. Test Early, Often, and Continuously

Exploratory Testing (ET) is a practice in which testing is instantaneous, which makes it very important in agile development. Many testing professionals believe that testing should be as early, frequent, and continuous as possible for proper application delivery. All types of testing (functional, load, etc.) should be included in the project plan.

Conclusion

In agile software development, the development stages matter as much as the end product, so testing is an integral part of the development process. In the early days of software testing, quality assurance personnel had little visibility into what was being tested or the results. With the agile movement, software companies and professionals have a more real-time view of the testing environment and scenarios. Agile development has shorter iterations, leading to smaller test cases, and a good test automation solution can help produce faster builds.

In order to provide a quality product to customers on a short delivery schedule, MSys moved from the conservative waterfall and V-Model approaches to agile testing, which paved the way for us to address customers’ continuously changing requirements and quality feedback.

Big Data and Your Privacy: How Concerned Should You Really Be?

Today, every IT-related service, online or offline, is driven by data. In the last few years alone, the explosion of social media has generated a humongous amount of data, which is all but impossible to manipulate without specialized high-end computing systems. Most of us are familiar with kilobytes, megabytes, and gigabytes of data, some even with terabytes. But when it comes to the Internet, data is measured on entirely different scales: petabytes, exabytes, zettabytes, and yottabytes. A petabyte is a million gigabytes, an exabyte is a billion gigabytes, and so on.
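These scales compound by a factor of 1,000 at each step, which a few lines of Python can verify (decimal, SI-style units; binary units based on 1,024 would differ slightly):

```python
# Decimal byte scales, as used in the article: each step is 1000x.
UNITS = ["kilobyte", "megabyte", "gigabyte", "terabyte",
         "petabyte", "exabyte", "zettabyte", "yottabyte"]

def bytes_in(unit):
    """Number of bytes in a decimal (SI) unit."""
    return 1000 ** (UNITS.index(unit) + 1)

def gigabytes_in(unit):
    """Express a unit in gigabytes, e.g. a petabyte is a million GB."""
    return bytes_in(unit) // bytes_in("gigabyte")
```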

A Few Interesting Statistics

Let me pique your interest with a few statistics here from various sources:

  • 90 percent of data in existence in the world was created in the last two years alone.
  • Let’s look at Facebook, for instance: there are 54 million pages, and every twenty minutes a million links are shared, two million friend requests are made, and three million messages are sent. On top of that, there are over 81 million fake Facebook accounts.
  • Amazon sells five times as much as Wal-Mart, Target, and Buy.com combined because it grew steadily from a miniature bookseller into a company with 74 billion dollars in revenue by incorporating all the statistical customer data it has gathered since 1994. In a week, Amazon targets close to 130 million customers. Imagine the enormous amount of big data it can gather from them.

Google’s former CEO and current executive chairman, Eric Schmidt, once said: “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.” The significance of this statement is evident when you realize the magnitude of data that the search giant crunches every second. In its expansive index, Google has stored somewhere between 15 and 20 billion web pages.


On a daily basis, Google processes five billion queries. Beyond these, through the numerous Google apps that you use continuously, such as Gmail, Maps, Android, Google+, Places, Blogger, News, YouTube, Play, Drive, and Calendar, Google is collecting data about you on a huge scale.

All of this data is known in industry circles as “big data.” Processing such huge chunks of data is not possible with ordinary hardware and software, which is why industry-standard systems exist for the purpose. Apache Hadoop, inspired by Google’s MapReduce and GFS papers, is one such system. The components of Hadoop (HDFS, MapReduce, YARN, etc.) provide intensive data storage and processing capabilities. Similar to Hadoop, Apache Storm is a big data processing technology used by Twitter, Groupon, and Alibaba, one of the largest online retailers in the world.
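At its core, the MapReduce model that Hadoop implements is simple: a map phase emits key-value pairs, and a reduce phase aggregates them by key. Here is a toy, in-process word-count sketch of that model; real Hadoop jobs distribute both phases across a cluster.

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit (word, 1) pairs from each document.
    # On a cluster this runs in parallel across many machines.
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def reduce_phase(pairs):
    # Shuffle + reduce: group pairs by key and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

def word_count(documents):
    return reduce_phase(map_phase(documents))
```

The same split (stateless mapping, then keyed aggregation) is what lets these systems scale from a laptop to petabytes of input.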

The effects and business benefits of big data can be quite significant. Consider the growth of Amazon in the last few years. In a lengthy article, George Packer gives a “brief” account of Amazon’s growth from bookseller to the multi-product online-retail behemoth it is today. What made that happen? In essence, the question is what makes the internet giants what they are today. Companies such as Facebook, Google, Amazon, Microsoft, Apple, and Twitter have reached their positions by systematically processing the big data generated by their users, including you.

In essence, data processing is an essential tool for success on today’s Internet. How does the processing of your data affect your privacy? Some of these internet giants gather and process more data than all governments combined. There really is a concern for your privacy, isn’t there?

Look at the National Security Agency (NSA) of the US. It is estimated that the NSA has a tap on every smartphone communication that happens across the world, through any company established in the United States. The NSA is the new CIA, at least in the world of technology. Remember the PRISM program that NSA contractor Edward Snowden blew the whistle on. For six years, PRISM remained under cover; now we know that the extent of data collected by this program is orders of magnitude greater than the data collected by any technology company. Not only that: the NSA, as reported by the Washington Post, has a surveillance system that can record one hundred percent of telephone calls from any country, not only the United States. The NSA also allegedly has the capability to remotely install a spy app (known as DROPOUTJEEP) on iPhones. The spy app can then activate the iPhone’s camera and microphone to gather real-time intelligence about the owner’s conversations. Independent security analyst and hacker Jacob Appelbaum reported this capability.

The NSA gets a record of every activity you do online: telephone and VoIP conversations, browsing history, messages, email, online purchases, and so on. In essence, this big data collection is the biggest breach of personal privacy in human history. While the government assures us that the entire process is for national security, there are definite concerns among the general public.

Privacy Concerns

While companies are using your data to grow their profits, governments are using this big data to further surveillance. In a nutshell, this could all mean one thing: no privacy for the average individual. As far back as 2001, industry analyst Doug Laney characterized big data with three V’s: volume, velocity, and variety. Volume refers to the vastness of the data that comes from the people of the world (which we saw earlier); velocity, to the breathtaking speed at which the data arrives; and variety, to the many different forms the raw data takes.

What real danger is there in sharing your data with the world? For one thing, if you are strongly concerned about your privacy, you shouldn’t be doing anything online or over your phone. While sharing your data helps companies like Google, Facebook, and Microsoft show you relevant ads (while increasing their advertising revenues), there is virtually no downside for you. The sizeable data generated by your activities goes into a processing phase in which it is amalgamated with the big data generated by other users like you. It is in many ways similar to disappearing in a crowd, something people like us do in the real world on a daily basis.

However, online there is always a trace that goes back to you, through your country’s internet gateway, your specific ISP, and your computer’s IP address (attached to a timestamp if you have a dynamic IP). So it is entirely possible to create a log of all your activities online. Facebook and Google already have such a log, the thing you call your “timeline.” For now, the timeline is a simple representation of your activities online, attached to a social media profile, but combined with a trace of your computer’s web access, the data generated is pretty much a log of your life. Then it becomes sort of scary.

You are traceable not only while you are in front of your computer but also when you move around with your smartphone. The phone can virtually be tapped to capture every bit of your conversations, and its hardware components (camera, GPS, and microphone) can be used to trace your every movement.

When it comes to life online, the choice is between your privacy and better services. If you divulge your information, companies will be able to show you useful ads for products you may really like (and play God with your life!). On the other hand, there is always an inner fear that your every movement is being watched. To avoid it, you may have to keep the things you want secret offline, away from any connected digital device, in essence, any device with a power source attached.

In an article I happened to read some time back, it was mentioned that the only way to bypass surveillance is to remove the battery from your smartphone.

The question remains: how can you trust any technology? There are a huge number of surveillance technologies and projects that people still don’t know about. With PRISM, we came to know the NSA’s tactics, although most of them were an open secret. Which other countries engage in such tactics is still unknown.

Advantages of DevOps Continuous Delivery Model

You may be wondering what DevOps means. According to Wikipedia, it’s a portmanteau of Development and Operations, two integral parts of any software firm. Development teams build a software product or service, and Operations gets it into production.

In essence, these two parts of the firm face slightly different challenges. The software market is quite expansive, and for a company such as MSys, it is also quite dynamic. Developments happen daily in storage, embedded systems, telecommunications, quality assurance, virtualization, and more. The operations team has to have first-hand knowledge of these developments and be in a position to react quickly to market dynamics. Operations also has to keep hardware systems in pristine condition to support development lifecycles.

In software development, the challenges are more technical: timely releases, quality assurance of those releases, additional requirement gathering, and constant communication. On deeper analysis, though, the two sets of challenges largely overlap. This is one of the reasons many companies have adopted a model in which the development side works hand in hand with the operations side. MSys has a similar structure, in which the operations team has sophisticated knowledge of the development lifecycle and products.

A few stereotypes have become attached to Development and Operations. For instance:

  • Developers are lazy and not interested in deployment and operations.
  • Operations always blame developers for failure of the application or deployment.
  • Operations always complain that they are kept out of the loop on feature enhancements and new developments.
  • Operations are not concerned about code, and developers are not concerned about business growth.

In a modern IT business, these differences of opinion can be costly. A hand-in-hand approach, in which developers and operations work together, can have many benefits.

1. Business Change & Growth

Especially in this fast-paced world, business changes happen often, and keeping abreast of them is critical to growth. The team that works with clients, manages all activities, and regulates the business is the operations team. If it doesn’t get adequate help from Development, growth can be stunted. This is one of the reasons DevOps is expected to drive enormous business growth in the coming years.

2. High Quality Releases More Frequently

This is an objective of any software development firm, isn’t it? High value, earlier! It’s an objective of DevOps too. Frequent, high-quality releases can be achieved with constant communication between Development and Operations.

3. Everyone Knows What’s Going On

With a shared version control system, the operations team gains inside knowledge of the software lifecycle. Building and deploying in one step is possible today, and it makes clear what changes were applied, when, and by whom.

4. Simple Code Improvements

In an agile development environment, the frequency and magnitude of changes can both be small. However, these minor changes can sometimes significantly impact a product. A DevOps team can manage minor code modifications and improvements more efficiently through Continuous Integration.

5. Improves Interpersonal Relationships

If there is no longer a rift between teams, there is no longer a difference of opinion. People working together build a better community and better interpersonal relationships. This is one of the ways DevOps improves a company’s culture, which in turn improves each team’s outlook toward any failure.

Conclusion

Overall improvement of a company’s products and services is achievable with DevOps. The aim remains improving the delivery model and satisfying more customers.

How Embedded Systems Transform the Healthcare Industry

Imagine how cumbersome healthcare used to be in the past. Back then, a person not feeling well approached a doctor, who prescribed medicines based on the patient’s external symptoms. How accurate could the diagnosis be in such cases? The reason a disease we take for granted today could kill masses a few decades ago is that diagnosis wasn’t thorough. Then technology advanced. We got X-ray, ECG, EEG, MRI, CT, pulse oximeters, GlucoWatches, electronic defibrillators, and a large number of sophisticated gadgets (embedded systems) with acronyms the general public has no idea about. Now the technology is even more advanced: new microchips, nanotechnology, and embedded systems have revolutionized the healthcare industry.

Look at General Electric, the multi-billion-dollar vendor of all kinds of electric systems and a premier provider of medical embedded systems. GE’s range covers scanning machines, imaging systems, and diagnostic equipment, and behind all these advanced systems is embedded technology. Take a look at this image of a huge PET scanner from GE, a perfect example of an embedded system:

GE PET scanner

[Image Source: General Electric]

I was perusing IEEE Spectrum, that acme of technology journalism, and stumbled upon an article describing what the future has in store for us. In the next few years, newborn babies will get tiny sensors within the first few minutes of birth. A chip implanted in the infant’s body will continuously monitor its health, and within two years the biometric data generated and stored in the cloud through this chip will exceed the entire amount of data created by everyone in the world today. Medical professionals can use this data to track every aspect of the child’s health.

Soon, medical gadgets will be more glittering and sophisticated than the ones in Ian Fleming’s books. A few days ago, the BBC reported on a gadget, a tiny ring, that reports and catalogs a person’s medical conditions. It is a perfect wearable that comes in handy in emergencies: a microchip embedded inside the ring alerts paramedics during an emergency.

health technology ring
[Image Source: BBC]

You have probably already heard about electronic tattoos that stretch with your skin. These temporary electronic tattoos are powered by solar energy and replace bulky gadgets, such as pacemakers, for monitoring an individual’s health. Since the material used is stretchable, sturdy, and highly flexible, you will not even know you are wearing a health monitor.

When it comes to advanced robotics for intricate surgical procedures, check out the da Vinci Surgical System, manufactured by Intuitive Surgical, Inc. This is the only robotic surgery system with approval from the US Food and Drug Administration (FDA).

da Vinci Surgery system
[Image Source: Intuitive Surgical, Inc]

Now, a team of researchers from the University of California, Santa Cruz, and the University of Washington has successfully created a set of seven robotic surgery systems for use by medical research labs across the US. These systems take an open-source approach to software development, cutting the cost of ownership to the bare minimum.

When it comes to embedded technology within gadgets, you probably know about a number of technologies. There are real-time operating systems in the embedded world that find applications in military-grade equipment; examples such as QNX (acquired by BlackBerry), OSE, VxWorks, uClinux, and LynxOS come to mind.

From small embedded systems that monitor heart rate or identify a blockage in an artery, the technology has seeped into intricate surgical procedures. It is quite possible that in the future you will own your own robot doctor, with or without remote assistance from a real one. As the functionality of embedded systems in healthcare increases, one thing that decreases is their size. A recent article in Discover suggests a possible device, a “microbot,” about a decade from now that can be inserted into your body through a tiny surgical incision. The microbot can travel through your blood vessels to the area of concern, fix minor issues such as a blockage in an artery, and collect tissue specimens for testing. A tiny camera attached to the device can send high-definition images and video to the doctor showing what is happening inside your body. The futuristic microbot could be powered by a tiny motor about the width of two human hairs.

As you can see, medical technology has advanced quite a bit with the help of embedded technologies. Stretching the limits of Moore’s law, semiconductors, processors, and chips are shrinking exponentially while the number of transistors on each chip grows by leaps and bounds. The SoCs, embedded operating systems, and software that power these devices undergo serious R&D. MSys also has considerable experience in embedded technologies and real-time operating systems (RTOS), making us a perfect innovator for the technology of tomorrow.

Related Article:

Internet of Things: An Introduction

Internet of Things: What the Future Has in Store!

Imagine your washing machine calling your smartphone and telling you in a Siri-like voice that it's time to wash your socks. Imagine receiving texts on your phone about your garage door being left open, your car running out of fuel, or your toaster finishing its work. In the near future, this may no longer be science fiction; it is very much possible that any object in your home will start talking to you, and not only to you but to the other objects around it. What enables this is a technology known as the Internet of Things (IoT).

What is this revolutionary new technology? What is the Internet of Things? The term has been around since the early '90s and has been used so heavily that it borders on cliché. People have come forward with other terms to substitute for "Internet of Things," but most of them turned out to be bush-league. The Internet of Things refers to a future in which your commonplace objects, things that you normally do not associate with technology, start to communicate as part of a network. This concept radically augments our idea of the smart planet, because you can communicate over the Internet with not just computers but every object in your home.

IoT and the Fascinating Future

If you are a fan of the popular sitcom The Big Bang Theory, you may remember an interesting episode in which the characters light lamps and turn down stereos from their laptops by sending signals across the Internet. After a while, having left access open, they find unknown people playing around with the lamps in their apartment. This kind of development is highly invigorating as well as slightly intimidating for many people.

While on one side people are talking about the advantages of IoT, a discussion is looming large on the horizon about the security concerns surrounding the concept. For instance, what if the bad guys hack into your smartphone to disable your home’s security system and open the doors of your house while you are away in Hawaii on a vacation?

One area IoT is going to transmogrify is the automotive industry. Already, cars are as smart as you want them to be. A few days ago, I was watching the Audi keynote at the International CES, and wow! The cars can not only park themselves but drive you through busy streets; the technology is that sophisticated now. Last week, Fox News published a piece on the V2V (vehicle-to-vehicle) communication system, which helps cars convey important information to other cars in the vicinity, such as whether or not the driver is applying the brakes properly to avoid a possible collision. The US Department of Transportation is considering a regulatory proposal for vehicle-to-vehicle communication. You can go to BBC Top Gear and be literally flabbergasted at the automotive technology that is emerging. In essence, cars have advanced more through technology in the last decade than they did in a century led by mechanical engineering. Embedded computing technology is at the helm of all these developments.

When IoT comes to our world, these cars will be well connected through 4G technology. They will communicate fluently to bring assistance to you wherever you are.

Cisco has done quite a bit of research on the Internet of Things (which they call the Internet of Everything in their vernacular). Check out their site; it's a goldmine of information on IoT. According to Cisco's findings, released in February 2013, IoT will be worth 14.4 trillion dollars globally in the next decade. I happened to look at the data concerning our country too, and the value at stake seems to hover around 35 billion USD. For the United States, the zenith of information technology, the total value at stake seems to be 473 billion dollars.

IoT: How Does It Change Your World?

I gave you a glimpse at the beginning of the article of how IoT is going to change your future. While some of the ideas may be a bit out there, there are virtually no bounds to the ways applications can be developed to incorporate things. Embedded systems will be built into almost every object to make it more intelligent; that's where the washing machine that talks and texts comes in. These developments can significantly improve your lifestyle.

Just like the geeks in The Big Bang Theory, the techies amongst us will be exhilarated by IoT. They will come up with specialized applications that do everything from opening garage doors to lifting toilet seats. The way IoT can uplift services in certain industries today is bound only by your imagination. Surveillance, security, healthcare, education, retail, and other industries will taste the massive benefits of the Internet of Things.

There is a minor problem, and it concerns software development. For an analogy, consider today's mobile app development. While a developer needs to concentrate on only one device (or two) in the case of iOS development, he has to consider a plethora of hardware configurations, resolutions, processors, and OS versions when it comes to Android. Now imagine a developer who needs to create an app that controls refrigerators or washing machines. There is more diversity there than the number of verses in the King James Bible, figuratively speaking.

This development intricacy, along with the rampant privacy concerns surrounding the subject, was also discussed in a recent GigaOM podcast.

How IoT redefines our world is well illustrated in this image:

[Image: Internet of Things]

As a first step toward inventorying everything so that it can be managed better, you can use technologies such as RFID (radio-frequency identification) and NFC to tag each object. The tagged objects can then be managed over a network, and locating and securing them becomes a piece of cake.
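The tagging idea above can be sketched in a few lines of Python. This is a toy illustration only, not a real RFID/NFC API; the tag IDs, object names, and function names are all made up for the example.

```python
# Toy sketch: an inventory keyed by tag ID (RFID/NFC), so that locating
# any tagged object becomes a simple dictionary lookup over the network.

inventory = {}

def register(tag_id, name, location):
    """Associate a tagged object with its current location."""
    inventory[tag_id] = {"name": name, "location": location}

def locate(tag_id):
    """Return the last known location of a tagged object, or None."""
    entry = inventory.get(tag_id)
    return entry["location"] if entry else None

# Hypothetical tag reads from around the house:
register("e200-3412", "toaster", "kitchen")
register("e200-9981", "garage door opener", "garage")
```

In a real deployment the `register` calls would be driven by tag readers reporting over the network, but the lookup principle is the same.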

There is, however, one little issue concerning IoT: standardization. We need to come up with a way to standardize the tagging technologies we use, whether RFID, NFC, barcodes, or QR codes. It should not be as wayward as the case of 4K resolution (for which there are six different resolutions and no fixed standard).

In essence, for coherence and congruence, everything from development to nomenclature should follow a standard.

How far are we from realizing IoT in our cities? Well, when it comes to ubiquitous cities (aka smart cities, in which everything, not just computers, is connected to the network), Songdo IBD in South Korea is probably the first.

Conclusion

I could ramble on and on about IoT, as it is quite an interesting topic. MSys's development teams have expertise in embedded computing technologies, which sit right at the cutting edge of IoT. It is inspiring to know that we are part of a global team working toward the future of technology.

The Business-Level Importance of Software Testing

Today, software development is no longer what it used to be. In the past, you had only one or two platforms to consider while developing software applications, and code releases and new versions came out once a year or so; testing was a relatively relaxed process. Today, we have a plethora of operating systems on both mobile and desktop platforms. In the mobile world, you have Android, Apple iOS, Windows Phone, and BlackBerry; although BlackBerry and Windows Phone occupy a diminutive slice of the market, you still can't ignore them. Software testing assumes great importance in agile development, where coding is a continuous process.

Quality Assurance of Your Apps

Imagine your bank provided an Android app for activities such as net banking and funds transfer. During development, the bank tested the app only on the Samsung Galaxy family of devices, because that family was highly popular. When you open the app on your Nexus 5, it performs ineffably badly, with UI faux pas such as invisible buttons, overlapping text boxes, and misaligned menus. You can imagine the poor user experience it would give you.

The bank cannot expect all of its users to have Galaxy phones, can it? Android is a serious challenge for testers due to its many versions and its adoption by many OEMs in a multitude of hardware configurations. As Android is a free, no-strings-attached operating system, OEMs go out of their way to modify it, developing their own user interfaces. While this freedom proved to be one of the reasons behind the enormous success of Android, it also made the OS's flavors incredibly diverse.

This device diversity makes testing an Android app an intricate task. Had your bank done proper testing on your Android app with the help of an emulator and a test automation suite such as Mobitaz, the app would have been much better.

Types of Software Testing

There are mainly two types of software testing: white-box and black-box testing. The difference lies in what the tester gets to see. In the black-box method, testing is based on the output generated across a range of inputs; the internal code structure is not evaluated. In white-box testing, aka glass-box testing, the tester takes into account the internal mechanism of the software.

Besides these, there are a number of areas a software tester needs to take care of: functionality, system, stress, performance, etc. All these different types of testing ensure that the application runs smoothly and provides all the functionalities expected.
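To make the black-box/white-box distinction concrete, here is a small Python sketch. The function under test is invented for illustration; the point is that black-box assertions check only inputs against expected outputs, while a white-box assertion deliberately targets a specific internal branch of the code.

```python
# Illustrative function under test (not from any real application).
def classify_triangle(a, b, c):
    """Return 'invalid', 'equilateral', 'isosceles', or 'scalene'."""
    if a + b <= c or b + c <= a or a + c <= b:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# Black-box: check outputs over a range of inputs, ignoring internals.
assert classify_triangle(3, 3, 3) == "equilateral"
assert classify_triangle(1, 2, 10) == "invalid"

# White-box (glass-box): knowing the code, target a specific internal
# branch: the triangle-inequality check that fires before any equality
# comparison is reached.
assert classify_triangle(1, 2, 3) == "invalid"  # degenerate: a + b == c
```

A black-box tester could easily miss the degenerate case above; a white-box tester picks it precisely because the code's branch structure is visible.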

Testing in the Android World

Android app testing may be a little more complex than testing desktop applications. As mentioned earlier in the case of the banking application, the tester needs to take care of a huge number of devices. Look at the diversity in the Android market as of February 18, 2014.

Version          Codename             API   Distribution
2.2              Froyo                 8        1.3%
2.3.3 - 2.3.7    Gingerbread          10       20.0%
3.2              Honeycomb            13        0.1%
4.0.3 - 4.0.4    Ice Cream Sandwich   15       16.1%
4.1.x            Jelly Bean           16       35.5%
4.2.x            Jelly Bean           17       16.3%
4.3              Jelly Bean           18        8.9%
4.4              KitKat               19        1.8%

Android OS version diversity

Android devices, although very sophisticated, do not provide the level of performance of a laptop or a desktop computer. Hence, performance testing of your app is very important. What if your app slows down the entire system, frustrating its users?

Also, some additional testing methodologies, such as regression testing and unit testing, assume importance in the case of Android apps. Regression testing is a method in which a modified component is tested for its effect on the entire system. Unit testing is done on a single unit or a group of related units; a unit may be one or more software components grouped together.
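As a minimal sketch of these two methodologies, here is a Python example using the standard unittest module. The discount function is invented for illustration; the same structure applies to any unit in an app.

```python
import unittest

def apply_discount(price, percent):
    """Unit under test: discount a price by a percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent out of range")
    return round(price * (100 - percent) / 100, 2)

class DiscountTests(unittest.TestCase):
    # Unit test: exercises a single component in isolation.
    def test_basic_discount(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    # Regression tests: pin down boundary behavior, and re-run the whole
    # suite after every modification to confirm nothing else broke.
    def test_boundary_percentages(self):
        self.assertEqual(apply_discount(59.99, 0), 59.99)
        self.assertEqual(apply_discount(59.99, 100), 0.0)
```

The suite can be run with `python -m unittest`; in an agile workflow it runs automatically on every code change, which is exactly what continuous regression testing means.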

Businesses Depend on Software Testing

If you are into software development, your aim should be to bring your application to the market faster than your competition does. Also, you cannot afford to offer a half-baked product. Your agile development and testing teams have to work hand in hand in order for the app to be stellar in every aspect. A tool to automate the testing of your app may spell the difference between failure and success in such scenarios.

Especially in mobile app development, where the competition is extremely hot, you really need to bring your app out fast. In such cases, an excellent test automation solution is a necessity.

Conclusion

Android is an operating system that enjoys continuous development; new features are added continually. Your app should be capable of taking advantage of this aspect of Android. Multi-touch gestures are one such feature that many test automation suites failed to incorporate for several years; Mobitaz, however, supports them, along with other difficult-to-automate components such as those found in hybrid apps. MSys's Mobitaz has been quick to catch up with the market with all the necessary features of a robust test automation solution.

Replicating a Virtual Machine: Process, Tools, and Technologies

Virtualization has transformed IT infrastructure as you know it. This is a technology domain with a slew of advantages: with benefits ranging from cost savings, energy savings, and faster provisioning of servers to increased uptime and improved disaster recovery, virtualization has truly overhauled the IT landscape. Companies including VMware and Microsoft are competing neck and neck in the virtualization domain; VMware products and Microsoft Hyper-V are at the helm of virtualization technology. Another hypervisor that assumes great importance is Citrix XenServer. In this article, the focus is on replication of virtual machines. Why is it important to replicate a virtual machine?

MSys is an organization with expertise in all three VM platforms—VMware, Microsoft Hyper-V, and Citrix XenServer. As a first step of analyzing virtual machine replication, let’s learn why it is important.

Virtual Machine Replication in Layman’s Terms

Large servers that provide services to client systems worldwide make use of virtualization technology for its obvious advantages. Scalability, upgradability, compatibility, and the other radical advantages of distributed computing and the client-server model are augmented by virtualization. One major aim in maintaining these servers is keeping them alive at all times, serving their clients. But nothing is immune to failure: a server can go offline at any time, and the downtime could stretch longer than you expect. How, then, is it possible to maintain services to clients? This is where virtual machine technology helps most, by replicating the virtual machine that runs on the server.

Replication is exactly what the word implies in the real world. A copy of the server operating system is created and moved to a different, robust, online location. The copy then continues to serve the clients.

Replication is advantageous in several scenarios:

  • When a server fails and its services go down
  • When the configurations of the server and its replica diverge
  • When the state of the server or its replica needs to be restored
  • For disaster recovery
  • For Infrastructure as a Service (IaaS) cloud offerings for enterprises

Although that is quickly said, there is more to replication than simple copying.

VMware and Hyper-V provide specialized tools and technologies for quick replication of virtual machines. vSphere from VMware is a versatile and popular cloud-computing virtualization platform. Being a top player, VMware vSphere has all the facilities of a hypervisor, including replication. vSphere Replication is provided gratis with all vSphere licenses from Essentials Plus to Enterprise Plus.

Quiescence and Consistency

In a virtual machine replication scenario, you do not want the replication to proceed while data within the virtual machine is still being modified (written or read); inconsistencies will fail the virtual machine at the receiving end. In order to maintain consistency, virtual machines make use of a tool such as Microsoft VSS (Volume Shadow Copy Service, or Volume Snapshot Service). In a previous post, I mentioned VDS and VSS; VSS is an essential technology used to prevent data conflicts during VM replication.
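To illustrate why quiescence matters, here is a toy Python sketch in which writes are paused with a lock while a snapshot is taken. This only illustrates the idea; real VSS coordinates far more elaborately with application writers and the storage stack.

```python
import threading

class QuiescedVolume:
    """Toy volume: writers are quiesced while a snapshot is taken."""

    def __init__(self):
        self._lock = threading.Lock()
        self._blocks = {}

    def write(self, block_id, data):
        # Writers take the lock, so they block for the duration of a
        # snapshot instead of racing with it.
        with self._lock:
            self._blocks[block_id] = data

    def snapshot(self):
        # Holding the lock quiesces all writers; the copy returned is
        # therefore internally consistent.
        with self._lock:
            return dict(self._blocks)

vol = QuiescedVolume()
vol.write(0, b"boot")
vol.write(1, b"data")
snap = vol.snapshot()   # consistent point-in-time copy
vol.write(1, b"newer")  # later writes do not disturb the snapshot
```

Without the lock, a snapshot taken mid-write could mix old and new blocks, which is exactly the inconsistency that fails the replica at the receiving end.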

Retention of historical VM states is a major part of the replication process. If a storage administrator wants to pick an older configuration and bring it back to the front, the historic replication data will be helpful. The VM replication process therefore includes a way to retain historic copies for a period of time.

The Replication Process

[Image: VM replication]

Part of the VMware installation package is a replication agent. Replication can be done quickly for as many as 500 virtual machines created with VMware. In detail, the process involves a few simple steps.

vSphere Replication initially performs a full synchronization of the source VM and its replica. To make this easier, the replication process can also use a seed copy of the data at the destination. After the baseline synchronization, vSphere Replication transfers only the data blocks that have been modified, which makes subsequent synchronizations much faster.
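The changed-block idea can be sketched in Python. This is a toy model of delta synchronization, not VMware's actual implementation; the block size and hashing scheme are arbitrary choices for the example.

```python
import hashlib

BLOCK_SIZE = 4096  # illustrative fixed block size

def block_digests(data):
    """Hash each fixed-size block of a disk image."""
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(data), BLOCK_SIZE)]

def changed_blocks(source, replica):
    """Return indices of blocks that differ between source and replica."""
    src, dst = block_digests(source), block_digests(replica)
    return [i for i, (a, b) in enumerate(zip(src, dst)) if a != b]

def sync(source, replica):
    """Copy only modified blocks onto the replica (baseline already done)."""
    out = bytearray(replica)
    for i in changed_blocks(source, replica):
        out[i * BLOCK_SIZE:(i + 1) * BLOCK_SIZE] = \
            source[i * BLOCK_SIZE:(i + 1) * BLOCK_SIZE]
    return bytes(out)
```

Because only the differing blocks cross the wire, each incremental synchronization costs time proportional to the amount of change, not to the size of the whole virtual disk.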

The vSphere Replication protocol ensures that replication happens only when data is modified in the original virtual machine. The protocol is lightweight and supports replication to any location, whether or not the destination has vSphere Replication functionality. This versatile system can also manage over 500 virtual machine replications within one instance.

MSys’s Engagement in VM Replication

VMware provides its own replication systems; however, in heterogeneous virtualization environments involving other hypervisors such as XenServer and Hyper-V, companies look for third-party replication products. MSys has extensive knowledge of the major virtualization platforms and engages with clients to develop such replication products.

Storage, Virtualization, Testing, Test Automation, Embedded Systems