E-commerce & Web Intelligence: Gaining a Competitive Edge; Interview with Lauris Lietavietis, Chief Sales Officer at Oxylabs

As the e-commerce industry continues to grow, so does the competition. Finding new ways to gain an edge over competitors is a constant challenge. Public web data, according to Lauris Lietavietis, the Chief Sales Officer at Oxylabs, is one such way.

We sat down with Lauris to discuss his experience and expertise in combining public web data with e-commerce. He shared a high-level overview of how e-commerce companies can gain that sought-after edge over their competitors by using web scraping and big data.

Lauris is set to present a more in-depth analysis of his ideas at the DMEXCO conference, providing precise examples of how companies are already using public web data and how others can do the same.

Your presentation at DMEXCO is about “Using Real-Time Public Data for Competitive Advantage in E-Commerce.” How has e-commerce changed with the increased availability of public web data, and how do you see it changing in the future?

We’ve seen immense growth and adoption of programmatic commerce over the past few years. We take it to mean automated, data-driven tactics to optimize product offering, pricing, and advertising. 

Programmatic commerce essentially combines the predictive power of data with changing consumer needs and inventory optimization. Companies using web intelligence, among other data sources, can now optimize their inventory and offerings ahead of time.
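To make this concrete, here is a minimal sketch of one programmatic-commerce tactic – a repricing rule driven by scraped competitor prices. The function, thresholds, and numbers are illustrative assumptions, not any specific product's logic.

```python
# Illustrative repricing rule driven by scraped competitor prices.
# All names and thresholds here are assumptions for the example.

def reprice(our_price: float, competitor_prices: list[float],
            floor: float, undercut: float = 0.01) -> float:
    """Undercut the cheapest competitor slightly, but never drop
    below our margin floor or raise above our current price."""
    if not competitor_prices:
        return our_price  # no market signal; keep the current price
    target = min(competitor_prices) - undercut
    return max(floor, min(our_price, target))

# Competitors scraped at 19.99, 21.50, and 20.25; our margin floor is 18.00
print(reprice(21.00, [19.99, 21.50, 20.25], floor=18.00))  # -> 19.98
```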

There’s a clear incentive to monitor and collect data from competitors as such usage enables many new strategies. But do you see any value in scraping data from non-competitors or even completely different websites for e-commerce companies?

Outside of programmatic commerce, there are many more use cases for web scraping. E-commerce companies are not just stores with a website. They are companies with a brand, name, reputation, and much more.

Amazon is a great example. They’re the largest e-commerce platform in the world but also have a lot of legitimacy to their name. Reputation management and consumer sentiment analysis are also prevalent use cases for e-commerce businesses.
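As a rough illustration of what consumer sentiment analysis over scraped review text can look like, the sketch below uses NLTK's off-the-shelf VADER analyzer; the sample reviews are invented for the example.

```python
# Minimal sentiment-analysis sketch over scraped review text,
# using NLTK's VADER analyzer. The reviews are invented sample data.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

reviews = [
    "Fast delivery and the product works great.",
    "Arrived broken and support never replied.",
]

analyzer = SentimentIntensityAnalyzer()
for review in reviews:
    score = analyzer.polarity_scores(review)["compound"]  # -1 (neg) .. +1 (pos)
    label = "positive" if score > 0.05 else "negative" if score < -0.05 else "neutral"
    print(f"{label:>8} {score:+.2f} {review}")
```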

Clearly, a lot can go well if an e-commerce company integrates web scraping solutions into their operations. Yet, what are the business risks and costs? How can a business approach such integrations in a way that would minimize costs and maximize profits?

I think it’s a question of building vs. buying – which part of the solution can be built in-house and what needs to be outsourced. It’s not just a binary decision where you either buy or build. You can buy some parts and build the others.

When it comes to costs, the largest are associated with the infrastructure layer – the proxies (IP addresses) that enable scraping. Building that in-house would be quite a large investment.
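To show what that infrastructure layer amounts to in code, here is a minimal sketch of rotating requests through a proxy pool with Python's requests library. The proxy addresses are placeholders, not real endpoints.

```python
# Sketch of rotating requests through a proxy pool.
# The proxy URLs below are placeholders, not real endpoints.
import itertools
import requests

proxy_pool = itertools.cycle([
    "http://user:pass@proxy1.example.com:8080",
    "http://user:pass@proxy2.example.com:8080",
])

def fetch(url: str) -> str:
    proxy = next(proxy_pool)  # each call advances to the next proxy
    response = requests.get(url, proxies={"http": proxy, "https": proxy},
                            timeout=10)
    response.raise_for_status()
    return response.text
```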

You can also outsource the entire scraping pipeline through services like the ones Oxylabs provides. All your teams would do is select which sources to scrape and what kind of data to collect. You would then use that data for various implementations.
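A fully outsourced pipeline typically reduces to a single API call. The sketch below shows the general shape; the endpoint, payload fields, and credentials are hypothetical placeholders, not a documented Oxylabs interface.

```python
# Sketch of delegating scraping to a hosted scraper API.
# Endpoint, payload fields, and credentials are hypothetical placeholders.
import requests

payload = {
    "source": "ecommerce",                     # hypothetical source selector
    "url": "https://example.com/product/123",  # page to collect data from
    "parse": True,                             # request structured output
}

response = requests.post(
    "https://scraper-api.example.com/v1/queries",  # placeholder endpoint
    json=payload,
    auth=("USERNAME", "PASSWORD"),
    timeout=30,
)
response.raise_for_status()
product = response.json()  # e.g., title, price, availability
```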

Finally, more and more companies are providing ways to get to the data itself without setting up the scraping. We recently began offering datasets, which are prepared for our customers ahead of time. There’s no scraping setup, development, or proxy management involved for them. All they need to do is analyze the data.
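With the dataset route, analysis can start immediately. Below is a minimal sketch with pandas, assuming a delivered CSV whose file and column names are hypothetical.

```python
# Sketch of analyzing a pre-collected pricing dataset with pandas.
# The file name and column names are hypothetical.
import pandas as pd

df = pd.read_csv("competitor_prices.csv")  # delivered dataset; no scraping involved
summary = (df.groupby("competitor")["price"]
             .agg(["mean", "min", "max"])
             .sort_values("mean"))
print(summary)
```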

I’ve heard from people deeply involved with web scraping that e-commerce businesses, while being one of the most prolific users of these solutions, still have only scratched the surface. What sort of data or strategies do you think e-commerce businesses are missing out on in terms of web scraping? 

I think many e-commerce businesses are overall well-developed in web intelligence implementation, especially when compared to others. There are quite a few players who are very advanced in the field. 

There may be differences in which strategies they choose (e.g., some may limit themselves to dynamic pricing while others may use it and inventory optimization), but across the advanced players of the sector, the usage of web intelligence is quite immense.
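For the inventory side of that pairing, a common starting point is a reorder-point rule. The sketch below uses the standard formula (average daily demand × lead time + safety stock) with invented figures.

```python
# Illustrative reorder-point rule for inventory optimization.
# Standard formula: daily demand * lead time + safety stock.
def reorder_point(daily_demand: float, lead_time_days: float,
                  safety_stock: float) -> float:
    return daily_demand * lead_time_days + safety_stock

# Invented example: 40 units/day, 5-day supplier lead time, 60-unit buffer
print(reorder_point(40, 5, 60))  # -> 260.0
```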

Yet, there is still a lot of opportunity for many mid-size and smaller e-commerce businesses to adopt public web data. Scalability and pricing are increasingly becoming less of an issue, making the solutions more accessible to them.

I’d say that, overall, it’s still about the adoption of programmatic e-commerce solutions. Comparatively few companies have implemented them, as these are the next step in the evolution of public web data.

Finally, there’s a ton of opportunity outside of e-commerce, within other verticals, where public web data can be utilized to improve business operations and performance.

What would you recommend to e-commerce businesses that want to integrate web scraping into their processes but have yet to do so? Is there a good step-by-step approach to doing so?

There are plenty of things that can be done, but first you have to define the main problem you intend to solve – is it pricing, product offering, or ad campaign optimization? There is no point in collecting data for its own sake, as data is not a good in itself.

Remember that data collection isn’t free, and to make impactful decisions, a high volume of data is needed. So, the first step is finding the problem and thinking of how the data can solve that problem.

Once that’s done, define the success criteria – how exactly will you review whether the problem has been solved? If you don’t have predefined success criteria, it becomes immensely easy to twist everything to fit your expectations or change criteria as you go. Some minor changes might be okay, but it’s often best to start with a specific set of goals and change them only if absolutely vital.

Finally, choose the right solution and tactics together with a trustworthy partner, such as Oxylabs. There are a lot of invisible pitfalls when it comes to web intelligence. For example, some websites might display irrelevant or fake data if they suspect that botting is happening. Reputable providers will help you avoid these pitfalls and focus on data analysis instead of fixing issues.
