When building web products, you have likely come across scenarios where the end-user experience breaks when users access the product from certain browsers or devices. Isn’t this common for many web (or app) developers? It definitely is!
Such an experience is primarily due to a lack of compatibility testing across different browsers, browser versions, and operating systems (or platforms). This form of testing is called cross browser testing (or cross compatibility testing). There is significant fragmentation in the browser and OS market, since users (or consumers) have the flexibility to access your app from their choice of browser, browser version, and platform.
Shown above is the desktop browser market share, an indicator that it is important to focus on providing a top-notch experience across different browsers. Your users will abandon your product if they face functional issues, lags, etc. when accessing it from their choice of device(s) and browser(s). This addresses browser fragmentation; the same argument also holds for device fragmentation.
Let’s dive a bit deeper. Shown below is the browser version market share, which indicates that Chrome 93.0, quite an old version of Chrome, still holds 0.06 percent market share as of May 2022. At the time of writing this blog, the latest version of Chrome is 104.0.5112.101. The same principle also applies to other popular browsers such as Firefox, Safari, and others.
What if your team does not perform cross browser tests on Chrome 93.0, even though it might still be used by your target audience? This could lead to a catastrophic drop in DAU (Daily Active Users), MAU (Monthly Active Users), and overall revenue. In this blog, we will cover the essentials of cross browser testing, including its best practices. The points covered in this blog are agnostic of the browser or browser version used by the audience in your target market.
In simple terms, browser compatibility testing, also known as cross browser testing, is a type of testing where the compatibility aspects of an app or website are verified on different browsers, browser versions, and operating systems. The goal is to provide a uniform experience to users across the browsers and devices they use.
Since customers can access web apps (or websites) from tablets, phones, and desktops, it is essential to perform cross browser tests on popular viewport sizes. Selenium, Cypress, Playwright, and Puppeteer are some of the most popular test automation frameworks for performing browser compatibility testing.
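As a minimal sketch, checking a page across popular viewport sizes boils down to resizing the browser window and re-running the layout assertions for each size. The `StubDriver` class below is a hypothetical stand-in so the sketch is self-contained; in a real Selenium suite you would use an actual WebDriver instance instead.

```python
# Hypothetical stub standing in for a real Selenium WebDriver;
# a real suite would use selenium.webdriver.Chrome(), Firefox(), etc.
class StubDriver:
    def set_window_size(self, width, height):
        self.size = (width, height)

# Popular viewport sizes for desktop, tablet, and phone layouts
# (illustrative values, not a definitive list).
VIEWPORTS = {
    "desktop": (1920, 1080),
    "tablet": (768, 1024),
    "phone": (375, 667),
}

def check_layout(driver, viewports):
    """Resize the browser to each viewport and record which ones were checked."""
    checked = []
    for name, (width, height) in viewports.items():
        driver.set_window_size(width, height)
        # ...layout assertions for this viewport would go here...
        checked.append(name)
    return checked

print(check_layout(StubDriver(), VIEWPORTS))
```

The same loop works unchanged for any framework that exposes a window-resize call, which is why viewport coverage is usually added as a thin layer on top of existing functional tests.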
It is recommended to partner with an experienced compatibility testing company in order to achieve the best test and browser coverage for your product. You could run cross browser tests on cloud infrastructure like AWS, Azure, etc., but a much better option for scalable cross browser testing is running tests on cloud-based grids offered by providers like LambdaTest, BrowserStack, Sauce Labs, etc.
Now that we have covered some of the basic aspects of compatibility testing, let’s dive into some of its major benefits. Manual testing is not a recommended approach, so we will cover the benefits primarily from an automation perspective.
We have already covered why it is important to test the website’s (or app’s) features on different browsers and operating systems. The ROI of cross browser testing multiplies when the tests are performed on a wide range of browser and operating system combinations.
A browser compatibility matrix can be helpful in prioritizing the combinations that matter the most to your target market. In short, browser compatibility testing is instrumental in improving the test & browser coverage; thereby improving the overall product quality.
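Building such a matrix can be as simple as filtering browser/OS combinations by the market share they hold in your target market. The sketch below assumes illustrative, made-up share numbers and a hypothetical `build_matrix` helper; the idea is only to show the prioritization logic.

```python
# Illustrative placeholder market-share data (not real figures):
# (browser, OS) -> percentage of your target audience.
MARKET_SHARE = {
    ("Chrome 104", "Windows 11"): 28.0,
    ("Chrome 104", "macOS 12"): 11.5,
    ("Safari 15", "macOS 12"): 9.0,
    ("Firefox 103", "Windows 11"): 4.2,
    ("Chrome 93", "Windows 10"): 0.06,
}

def build_matrix(share, threshold=1.0):
    """Keep only combinations above a market-share threshold,
    ordered by how many users they represent."""
    combos = [(combo, pct) for combo, pct in share.items() if pct >= threshold]
    return sorted(combos, key=lambda item: item[1], reverse=True)

for (browser, os_name), pct in build_matrix(MARKET_SHARE):
    print(f"{browser} on {os_name}: {pct}%")
```

Whether a long-tail combination like Chrome 93 makes the cut depends on the threshold you pick, which is exactly the business decision the compatibility matrix is meant to make explicit.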
Parallel test execution at different levels (i.e. thread, class, method, etc.) aids in faster execution of the test scenarios. You can accelerate the time to market by leveraging the parallel execution capabilities offered by the test automation framework as well as the potential of the cloud-based test infrastructure.
Consider a test suite that has 100 test cases, where each test case takes 1 minute to complete. If the tests are executed serially, the total execution time will be 100 minutes. On the other hand, the total execution time drops to 20 minutes if 5 tests are executed in parallel on different browser and OS combinations.
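The arithmetic above maps directly onto a worker pool: 100 tests across 5 workers means roughly 100 / 5 = 20 "rounds" of execution. Here is a minimal sketch using Python's standard `concurrent.futures`; `run_test_case` is a hypothetical stand-in for driving a real browser session.

```python
import concurrent.futures

# Hypothetical stand-in for executing one test case on a given
# browser/OS combination; a real suite would launch a browser session here.
def run_test_case(case_id):
    return f"test_{case_id}: PASS"

test_cases = list(range(1, 101))  # 100 test cases

# Serial execution: one test after another.
serial_results = [run_test_case(case) for case in test_cases]

# Parallel execution: 5 workers, mirroring 5 concurrent browser sessions.
with concurrent.futures.ThreadPoolExecutor(max_workers=5) as pool:
    parallel_results = list(pool.map(run_test_case, test_cases))

# pool.map preserves input order, so the outcomes are identical;
# only the wall-clock time differs (~5x faster for I/O-bound tests).
assert serial_results == parallel_results
print(f"Ran {len(parallel_results)} tests across 5 workers")
```

Cloud-based grids apply the same idea one level up: each worker maps to a remote browser session, so the parallelism ceiling is set by your plan's concurrency limit rather than local hardware.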
A test report is an integral part of any testing activity. Product managers, project managers, test architects, and other project stakeholders can get detailed insights from the test reports. Test reports provide information on the number of tests run in a particular release, the pass/fail ratio, test coverage, and much more. A test report is a true reflection of the overall quality of the product.
Automated browser compatibility testing tools generate detailed test execution reports that can be downloaded as well as shared with the concerned project stakeholders. Since the tests are executed on the cloud, team members in the project can access the test reports (present as well as past) at any point in time.
It is recommended to onboard an experienced compatibility testing services company so that you can get the maximum ROI out of cross browser testing.
Now that we have looked into the benefits of browser compatibility testing, let’s deep dive into cross browser testing best practices.
As discussed earlier, the browsers and operating systems used by your customers for accessing your product could differ from one market to another. For example, iOS holds close to 55 percent market share in the US market, whereas Android holds close to 95 percent market share in the Indian market. The same principle also applies to desktop (or tablet) based browsers and operating systems.
It is practically impossible to cover all the browser and OS combinations owing to the huge fragmentation in the web browser and OS market. Hence, you need to prioritize the browser and operating system combinations depending on your target market.
It can be overwhelming to test on a range of combinations, hence take up the ones that matter the most for your business.
Like all the other forms of testing, it is recommended to kick-start cross browser testing at an early stage of product development. Be it Selenium or Cypress or Playwright or any other test automation framework, all of them support integration with popular CI/CD tools like Jenkins, Circle CI, Semaphore CI, and more.
Due to these integrations, you can run browser compatibility tests as a part of the CI (Continuous Integration) pipeline. The best part about this approach is that you get detailed reports of the test execution. You can also reduce the developer feedback time since issues are reported to the right team members, as and when they are discovered in the pipeline.
All of this lets you unearth usability and cross browser issues at early stages of the product development.
Not all the issues discovered through cross browser tests are unique to the browser on which the tests are run. This particularly applies to styling issues that could be uncovered during browser compatibility testing.
You need to follow some basic sanity checks when running the tests; the major ones are mentioned below:
Popular frameworks like Selenium, Cypress, etc. can be leveraged for automated browser compatibility testing. Since these frameworks can be used for unit testing as well as E2E (End-to-End) testing, the tests must be designed so that they are less flaky (i.e., their behavior is deterministic).
This is possible when there is zero dependency between tests, so that the result of one test has no impact on the other tests in the suite. An autonomous test design must be followed when developing the test cases.
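In practice, test independence usually means every test builds its own fixtures rather than relying on state left behind by an earlier test. A minimal sketch with Python's standard `unittest` (the `CheckoutTests` class and its cart are illustrative, not from any real suite):

```python
import unittest

class CheckoutTests(unittest.TestCase):
    """Each test builds its own state in setUp, so the tests can run
    in any order (or in parallel) without affecting each other."""

    def setUp(self):
        # A fresh cart per test: no state leaks between test methods.
        self.cart = []

    def test_add_item(self):
        self.cart.append("widget")
        self.assertEqual(len(self.cart), 1)

    def test_empty_cart_total(self):
        # Does not depend on test_add_item having run first.
        self.assertEqual(len(self.cart), 0)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(CheckoutTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
assert result.wasSuccessful()
```

The same pattern carries over to Selenium or Playwright suites: create the session, the test user, and the test data inside each test (or its setup hook), and tear them down afterwards, so parallel runs on different browsers cannot interfere with one another.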
Cross browser testing can turn out to be a costly affair if the testing is performed on local infrastructure (or a local Grid). Apart from continuous maintenance and upgrades of the infrastructure, you would also need manpower to ensure minimal (to no) downtime when executing the tests.
It is recommended to shed these infrastructure costs by opting for cloud-based grids that let you perform browser compatibility testing at scale! Since there are so many companies that offer cloud-based cross browser testing infrastructure, you need to onboard the best compatibility testing company suited to your budget and requirements.
Earlier in the blog, we touched upon the importance of faster feedback time. Since cross browser tests are run on a range of browser and OS combinations, it’s essential to leverage the benefits of parallel test execution offered by the test automation framework and the testing infrastructure.
Report generation, issue reporting, parallelism, and other such integral aspects must be leveraged to reduce the test execution time by a huge margin. Irrespective of the test automation framework and infrastructure being used, parallel test execution must be considered whenever tests are triggered.
Apart from the above best practices, you must also consider testing on real mobile devices instead of emulators (or simulators), since it lets you test the features of your mobile application more comprehensively.
It is a known fact that there is huge fragmentation when it comes to browsers, viewport sizes, browser versions, and platform versions. Hence, you need to invest heavily in browser and platform compatibility testing to ensure that the users of your website or app do not run into usability or compatibility issues when accessing the product from their preferred device.
However, adopting the best practices of cross browser testing is essential so that you can make the most of the testing activity. The cross browser testing best practices covered in this blog can help you realize your compatibility testing needs at scale!
Do share your preferred browser testing best practice in the comments section of this blog…