Exclusive Interview With Sonny Laskar – Kaggle Master And Analytics Vidhya Hackathon Expert



What’s the key to cracking data science competitions? How do you use this experience to break into the data science industry? We regularly come across these questions from aspiring data scientists wondering how to make a name for themselves in data science.

Who better to answer these questions and provide an in-depth insight into the data science world than a Kaggle Master and an Analytics Vidhya hackathon expert? Ladies and gentlemen, I’m delighted to present Sonny Laskar!

Sonny is an MBA post-graduate from IIM Indore, the place he credits for starting his data science journey. So for any of you wondering if it’s possible to make a career transition to data science from a non-data science field – this article is for you.

I found Sonny to be a very approachable person and his answers, as you’ll soon see, are very interesting, knowledgeable and rich with experience. Despite holding a senior role in the industry, Sonny loves taking part in data science competitions and hackathons and regularly scales the top echelons of competition leaderboards.

Sonny also holds a lot of experience in the data engineering side of this field. As you can imagine, there is a LOT we can learn from him. I had the opportunity to pick his brain about various data science topics and bring this article to you.

We covered a variety of data science topics during our conversation:

Sonny’s background and his first role in data science

The difference between data science competitions and industry projects

Sonny’s framework and approach to data science competitions

And a whole lot more! There is SO much to learn from Sonny’s knowledge and thought process. Enjoy the discussion!

Sonny Laskar’s Background and First Role in Data Science

Pranav Dar: You are currently the Associate Director of Automation and Analytics at Microland, finished 4 times in the top 3 in AV’s hackathons, and hold a runner-up finish in a Kaggle competition. It’s been quite a ride! How and where did your data science journey begin?

Sonny Laskar: My Data Science journey started when I was pursuing my MBA from IIM Indore. Analytics was the go-to area for every aspirant. One of the early topics of discussion was how Target figured out a teen girl was pregnant before her father did. This made me very curious, and I started to dive deep into the world of Data Science.

I had already worked extensively with data, but mostly around engineering problems and business intelligence. Serious machine learning was not yet popular among organizations in India back then.

“I spent two months at the University of Texas, Austin in early 2014 and was surprised by the level of maturity they had with data. My visit to Dell’s headquarters in Austin and how they used social media data to enhance their product positioning was amazing. By the end of this, I was completely convinced that I needed to work on this.”

PD: Your professional career didn’t start off in data science. The first 6 years or so were spent on data warehousing and infrastructure. So what kind of challenges did you face when you were getting into data science? How did you overcome them?

SL: I started my career in 2007 in the world of IT Infrastructure. In the initial six years, I was primarily working on building massive-scale data warehousing applications (processing ~10TB of data). The focus was more on ETL and BI. Dashboards and data marts were the primary output of all these efforts. This was what we called “Descriptive Analytics”.

By 2014-15, “Predictive Analytics” was already getting a lot of attention and adoption in the US. It was then that many organizations in India started looking at “Predictive Analytics” with significant focus. We were already processing Terabytes of data and were very well versed with the engineering side of things.

I was able to understand the fundamentals of Data Science very well since my Mathematics and Statistics concepts are strong and I had a fair exposure to programming.

I started with R since that was the programming language popular in academia, and improved my understanding by practicing writing code and replicating other people’s work.

During my MBA, I got a bird’s eye view of many statistical and Data Science approaches. Since the focus during MBA was more on business, it didn’t allow me to master the technical skills as much as the industry needs. Post my MBA, I started spending roughly 4-5 hours every day writing code and building on top of it.

I had already written plenty of code in Bash, JavaScript, PHP & Perl, so the learning curve was not very steep for me. I also invested in getting access to cloud subscriptions so that I could play with large volumes of data. I think it’s worth investing that money when you believe it is going to be helpful in the long term.

Patience, Perseverance & Practice has been my thumb rule for everything in life, which was what I applied here as well.

Industry Experience versus Data Science Competitions

PD: We often hear from hiring managers how aspiring data scientists participate in hackathons and competitions and struggle to bridge the gap during their transition into an industry role. You have been on both sides of this – you hold rich experience in data science and have excelled in hackathons. What has been your experience in the industry vs. hackathon debate?

SL: One of the best ways that works is establishing credibility by participating in data science competitions.

Just like most things in life, competitions have their pros & cons. There is a lot of preparatory work that gets done before a competition is published. That work is at times extremely complex, time-consuming, and needs multi-domain understanding.

Similarly, the competition ends with a leaderboard score without any view of what was done with the winners’ solutions. These are grey areas for many first-timers in Data Science, which creates a lot of issues when they join the industry.

I have conducted at least 100 in-person interviews in the last year and I can see this struggle very prominently. Data Scientists are not expected to just design a machine learning model to predict something. In many organizations, discussions in meeting rooms end up with a task for the Data Scientist such as “Let us build a model to predict X”.

A good Data Scientist might end up concluding that many such X use cases should not be solved at all with machine learning! A Data Science team is not expected to be very large in the real world. They might get involved in many tasks which are either not valuable or can be easily solved without using Machine Learning.

If they feel it can be solved with Machine Learning, then there must be a series of discussions to understand what data would help them address that.

“Unlike competitions, nobody gives you two .csv files called train and test and a nicely written evaluation metric. Almost 80% of the effort goes into defining the problem and getting and processing data. The remaining 20% goes into pure modeling and deployment.”

Exposure to competitions helps address a few parts of this:

Processing data and feature engineering

Building different types of models and getting the best score

These are very significant activities and hence recruiters use “competitions” as a good filter to focus on a smaller set of candidates.

To summarize, below are the key issues which competition focused people face when they join the industry:

Building business acumen for understanding how a problem statement helps the business goals and what data drives that

Having a problem-solver attitude

Understanding the software engineering side of production deployment

Story-telling: Ability to communicate the results to non-technical folks

Data Science Hackathons and Competitions

PD: Ever since data science started becoming mainstream in the last 5 years, multiple competitions keep happening across platforms simultaneously. How do you pick and choose which data science hackathon or competition you’ll participate in?

SL: I was hooked on data science competitions when I started out. I used to participate in as many competitions as I could! Lately, my personal interest has plateaued as the incremental learning has diminished. Now I participate only if I have time and a very interesting problem.

I also try to participate in offline hackathons along with my Kaggle Grandmaster friend Sudalai Rajkumar (SRK). I usually participate based on three factors:

The novelty of the problem: If the problem statement is something new to me, from an existing or new domain which I might not have enough experience in, I like to play with the data as it helps me build some intuition on that problem/domain

Data size: I love problems where the data size is extremely large. I like the kick I get when I run models on machines with 500 GB RAM and 64 Core processors. It is a lot of fun!

Multiple possible approaches: If there are multiple techniques I can experiment with. In fact, our first Kaggle competition needed us to perform both Text Analytics & Image Analytics and find a clear way to merge both

PD: How should a beginner go about participating in these data science hackathons? Which kind of competition should they first dip their toes into?

SL: As a beginner, it is important for folks to know the basic building blocks.

They should start with relatively easy data science competitions. Below is what aspiring data scientists should do in the initial few weeks:

Understand the data well. Do not jump directly into running xgb.train

Read about what transformations are effective for your problem & model:

Example: Does one-hot encoding help, or is numeric labeling better? Does the column have too many categories? Can we reduce them? Is that numeric field really a number, or is it a category?

Feature engineering is key, and your early learning on feature engineering will come from other people’s code. So, build a practice of reading others’ code line-by-line and replicating it. Ask yourself questions like: why did the author do that, and how does that help?

Kaggle kernels are an excellent place to read

On Analytics Vidhya, participants upload their code which beginners should read

Get familiar with the process of building models using different algorithms
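Sonny’s encoding questions above are worth trying hands-on. Here is a minimal sketch (assuming pandas and a made-up `city` column) of the two options he contrasts:

```python
import pandas as pd

# Toy frame with a categorical column (hypothetical data)
df = pd.DataFrame({"city": ["Delhi", "Mumbai", "Delhi", "Pune"]})

# One-hot encoding: one binary indicator column per category.
# Column count grows with cardinality, so watch for "too many categories".
one_hot = pd.get_dummies(df["city"], prefix="city")

# Numeric (label) encoding: a single integer code per category.
# Compact, but imposes an artificial ordering on the categories.
df["city_code"] = df["city"].astype("category").cat.codes

print(one_hot.columns.tolist())  # ['city_Delhi', 'city_Mumbai', 'city_Pune']
```

Tree-based models like XGBoost often cope well with label encoding, while linear models usually prefer one-hot; testing both on your validation score is the point of the exercise.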

PD: How should aspiring data scientists approach a competition?

SL: As we participate in many competitions, we realize that there are a common set of steps that we always follow. We should try to create a template out of it which we can easily modify in every competition. This makes life simpler.

I follow the below process:

Build a naïve base model using all features and basic feature engineering

Record each change and its score in an Excel sheet to track progress

Do hyperparameter tuning by hand (without spending too much time) to get something decent

Go back to data understanding and rework the features completely

Explore the data, build visual plots to see the patterns, etc.

Read discussions, kernels, etc.

Repeat all these steps
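The score-tracking step in this process can also be scripted instead of kept by hand. A minimal sketch, using only the standard library and invented iteration notes and scores, of what such an experiment log might look like:

```python
import csv
import io

# A stand-in for the Excel sheet: every change and its cross-validation
# score gets a row, so progress across iterations stays auditable.
log = io.StringIO()
writer = csv.writer(log)
writer.writerow(["iteration", "change", "cv_score"])
writer.writerow([1, "baseline: all features, default params", 0.712])
writer.writerow([2, "added interaction features", 0.734])
writer.writerow([3, "hand-tuned learning rate and depth", 0.741])

print(log.getvalue())
```

Writing to a real file instead of `io.StringIO` gives you a history you can diff between competition sessions.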

Data Science Industry-Related Questions

PD: What are 3 critical aspects of a data science project which you feel are often overlooked by newcomers?

SL: Interesting question. Here is what I would recommend focusing on:

Taking Models to Production:

In the real world, taking models to production takes a lot of effort. There are many things that data scientists need to do from a software engineering perspective, like building Docker containers, setting up a CI/CD pipeline, exposing REST APIs for prediction, version control, etc.
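The prediction-API part of that work can be reduced to a tiny sketch. The `predict` rule and the JSON payload below are invented stand-ins for a trained model; a real deployment would wrap such a handler in a framework like Flask or FastAPI, packaged in a Docker container behind a CI/CD pipeline:

```python
import json

# Hypothetical stand-in for a trained model's predict() method.
def predict(features):
    return 2.0 * features["x"] + 1.0

def handle_request(body):
    # Core of a REST prediction endpoint: JSON request in, JSON response out.
    payload = json.loads(body)
    return json.dumps({"prediction": predict(payload)})

print(handle_request('{"x": 3}'))  # {"prediction": 7.0}
```

The software-engineering effort Sonny describes is everything around this core: containerizing it, versioning the model artifact, and monitoring it in production.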

Understanding the Importance of SQL:

SQL is the one thing that every data scientist should learn, irrespective of which programming framework they use. SQL is something they will end up using for sure.
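As a hands-on illustration (using Python’s built-in `sqlite3` and a made-up `orders` table), this is the kind of aggregation query a data scientist ends up writing constantly:

```python
import sqlite3

# In-memory database standing in for a real warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 120.0), ("bob", 80.0), ("alice", 50.0)],
)

# Group-and-aggregate: total spend per customer.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS total "
    "FROM orders GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)  # [('alice', 170.0), ('bob', 80.0)]
```

The same `GROUP BY` pattern transfers directly to production warehouses, whatever the engine.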

Learning to write efficient code for Big Data:

Badly written code might not be a problem when working on a small dataset, but it becomes a show-stopper when we run it against large datasets. Such scenarios can be handled by making changes. For example, “for-loops” in your code can be very slow when they have to iterate over a long list. Instead, use vectorized operations or functional constructs such as map and lambda functions. There are many functional programming guidelines worth following
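A small sketch of the loop-versus-functional point above (the data and function names are illustrative):

```python
# Explicit Python loop: repeated method lookups make this slow on large data.
def squares_loop(xs):
    out = []
    for x in xs:
        out.append(x * x)
    return out

# map + lambda (or a list comprehension) pushes the iteration into C;
# NumPy vectorization goes further still for numeric arrays.
def squares_functional(xs):
    return list(map(lambda x: x * x, xs))

data = list(range(1000))
assert squares_loop(data) == squares_functional(data)
```

The results are identical; the difference shows up only in runtime once the list grows to millions of rows.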

PD: AutoML is coming up huge in the industry. What are some other trends in data science we can expect to see in the next 2-3 years?

SL: AutoML will eventually automate most of the model building & model deployment part of the work. This will include handling feature engineering (to quite an extent).

“Domain knowledge, logical reasoning, and a problem-solving attitude are what a Data Scientist would be expected to excel at.”

Other key trends that I see:

Adoption of Graphs in Machine Learning: Most folks do not use graphs. That’s a travesty! Graphs are such amazing structures for solving many complex problems

Augmented Analytics: Augmented Analytics uses machine learning and natural language processing to automate data preparation, surface insights, and enable data sharing

Autonomous Systems: Autonomous Systems, like driverless cars, can take decisions on their own; reinforcement learning is behind this. One of the products we are building at Microland is for “Autonomous IT”, which learns what a human does when there is a problem and replicates that behavior in real time

Rapid Fire Questions: Sonny’s Take on Various Data Science Aspects

PD: Tell us 3 things you have learned working in data science.

SL: There are too many to list down! But here are my top 3 picks:

Domain Knowledge is key

Being “Jack of Many Trades” helps a lot

Always think out-of-the-box

PD: Which is your favorite machine learning/deep learning algorithm and why?

SL: I use XGBoost & LightGBM for most of my tasks. They work almost every time. For deep learning, Keras with TensorFlow seems perfect to me.

PD: Which data science professional would you pick to take part in a high-stakes data science competition?

SL: Sudalai Rajkumar (SRK) any day!

PD: What advice would you give to aspiring data scientists?

SL: Here are a few tips from my experience:

Do not try to learn two languages at the same time. Master any one which you like. Ignore all the news that you hear like “Language X is better than language Y”, etc.

Build a decent GitHub profile with all the different types of problems you have tried to solve

Take an open problem where you can get data and build some Data Science application around that

Finally, participate in competitions and make it to the top!

End Notes

I thoroughly enjoyed interacting with Sonny Laskar for this interview. His knowledge, his thought process, and the way he articulates and structures his thoughts are something we can all learn from.



Exclusive Interview With Sanjay Kumar, Co-Founder & CEO, Spyne

1. Kindly brief us about the company, its specialization, and the services that your company offers.

Spyne is a Deep Tech startup helping businesses and marketplaces create and upgrade high-quality product images and videos at scale with AI. Creating good catalogs today is a manual, expensive, and time-consuming process. As a result, less than 10% of the sellers today on eCommerce marketplaces create catalogs. Spyne is building the industry’s first AI-powered visual cataloging platform that enables businesses to create studio-finish images 500x faster, at 1/4th cost & at scale. You don’t need expensive studios or photography skills, nor have to follow any complex, time-consuming processes. With Spyne, one can create stunning catalogs within minutes, driving approximately 40% better conversions. Currently, we are serving 80+ customers including MFC, Volkswagen, Megadealers, Sell Any Car, Karvi, Amazon India, Flipkart, etc., across 15+ countries in the automotive, food, and retail verticals.

2. With what mission and objectives was the company set up? In short, tell us about your journey since the inception of the company.

3. Brief us about the founders of the company and their contributions to the company and the industry.

I, being the Co-Founder and CEO, am leading the strategic goals of the company and constructing the company’s roadmap. I am also leading the product, tech, AI, and business teams. In the past, I have held leadership positions and led major initiatives at Amazon, Oyo, Fashion & You, Rocket Internet, and Yatra. Keeping customers and innovation at the heart of everything, I have launched multiple successful products used by millions of customers globally.

4. What is your biggest USP that differentiates the company from competitors?

We are enabling businesses to automate their entire cataloging process (shoot, edit, and publish) by replacing their traditional processes with cutting-edge AI technology. We stand out in this area as our AI delivers accurate results and can help businesses scale infinitely, 500X faster at 1/4th the cost.

5. Kindly mention some of the major challenges the company has faced till now.

One of the earliest challenges we faced was how to train our AI models to give accurate outputs. To get near-studio image outputs with AI, we needed to solve multiple problems, each of which required focused resolution and data training.

For example, how do we bring out the best image output of a car if the car is not shot from the correct angle, or if the light is too low when the car is shot in the evening after sunset? There were also issues of surrounding objects and trees showing through the see-through windows of the cars, reflections of buildings and trees on a car’s body, etc.

All of these were real challenges for which photographers needed solutions, and each required focused training data. Getting a huge volume of images for each of these specific issues was extremely difficult. Therefore, we set up a data annotation team, which helped us generate focused data for each of these problems from our ongoing client projects. This strategy helped us improve and train our AI/ML models to make them more efficient and accurate over time. As a result, our models improved their outputs from about 60-70% accuracy at the beginning to 99%+ accuracy.

6. Please brief us about the products/solutions you provide to your customers and tell us how they get value out of it.

We are helping businesses and marketplaces create studio-finish product catalog images at scale with our industry-first, computer vision, and AI-powered solutions. Our products leverage deep learning to understand the visual intent far more quickly than humans.

Through our products such as Whitelabel apps and SDK solutions, we are reducing the 2+ weeks of catalog photography process to just a few minutes. This decreases the product’s time-to-market by 80%, allowing businesses to scale faster and more efficiently.

We also have a web tool, called Spyne Darkroom that helps AI validate & edit the existing images that sellers have and instantly convert them to high-quality images for better engagement & conversions. This solution is helpful for SMB businesses who have limited SKUs and wish to change their products’ look and feel for online selling.

Our AI Editing technology is built for scale, typically editing 500+ images per minute. Enterprise and marketplaces opt for API integrations to transform their catalog at scale at nearly 1/4th the cost.

7. How do you see the company and the industry in the future ahead?

Ecommerce cataloging is a huge $40 billion+ industry. Right now, we have an enterprise-first approach. We want to launch our product for B2B2C, at least for specific categories (automotive, eCommerce, fashion, food), in the next year. Our long-term goal is to integrate our AI camera application via SDK integration with all the major marketplaces in India and abroad.

8. AI is projected to be the next market. How is AI contributing to the making of your products and services?

The second problem we are solving is how to convert these smartphone-shot images to match up to the studio-quality images for online selling.  With AI, we are automating the entire manual editing process to replace and add backgrounds, autosize, colour correct, add shadows, etc., and get the final images within seconds.

9. What are your growth plans for the next 12 months?

Currently, Spyne caters majorly to enterprises and marketplaces. We have a major clientele in the automotive, eCommerce, and food industries. However, going ahead, we aim to soon launch our services for the B2B2C segment specifically for automotive dealerships, e-Commerce, and fashion businesses. Edging forward into the future, we have big plans to expand into Metaverse and Omniverse with the AR/VR technology that we are parallelly developing.

Exclusive Interview With Paolo Ardoino, Cto, Bitfinex

Gone are the days when the concept of cryptocurrency was looked upon with suspicion and considered detrimental to the conventional financial system which stands as a symbol of stability and security. Common people are getting more interested in investing their hard-earned money into cryptocurrencies, thanks to the ever-growing breed of accessible and versatile independent crypto exchanges allowing investors to trade in virtual currencies. Bitfinex is one such crypto exchange that believes in providing the investor with an unparalleled crypto investment experience. Analytics Insight has engaged in an exclusive interview with Paolo Ardoino, CTO, Bitfinex.

1. Kindly brief us about the company, its specialization, and the services that your company offers.

Bitfinex is a cryptocurrency exchange that is at the forefront of technological innovation in terms of digital token trading. Our mission has always been to give our users the ultimate cryptocurrency trading experience. As such, we continuously strive to provide our users with state-of-the-art trading tools, innovative technology, and unparalleled levels of customer service.

2. With what mission and objectives was the company set up? In short, tell us about your journey since the inception of the company.

Since Bitfinex was founded in 2012, our team has gained invaluable experience in blockchain technology while also cementing our position as the go-to place for digital asset traders and institutions to trade.

The digital asset space is evolving at a breakneck pace and keeping up with such rapid technological innovation requires an equally forward-thinking and agile approach. Bitfinex provides state-of-the-art digital asset trading services for our users and global liquidity providers. We firmly believe that the best cryptocurrency trading experience should be available to everyone.

3. Kindly share your point of view on the current scenario of Big Data Analytics and its future.

Today, big data is everywhere. You have cars basically selling traffic data to Google. You also can have lightbulbs with a WiFi chip. It’s very exciting. In the blockchain space, you have things like the Lightning Network.

This is a bitcoin second-layer solution that employs micropayment channels to increase the blockchain’s ability to process transactions more quickly. In a nutshell, the Lightning Network enables users to send bitcoin in a matter of seconds at a fraction of the cost. There have been various use cases of how the Lightning Network is being combined with AI and big data to spearhead innovation. We have seen self-driving cars, which rely on a combination of AI-generated data, begin to share data via an integrated network based on the BTC blockchain. With the emergence of the Lightning Network, the transmission of information on a secure and immutable network becomes possible. In the future, AI platforms will be able to communicate data more efficiently, allowing AI systems to better assess their surroundings in real-time.

Overall, the future of big data points to more digital growth, as almost every industry will come to depend on how information is stored, processed, and applied.

4. How is IoT/Big Data/AI/Robotics evolving today in the industry as a whole? What are the most important trends that you see emerging across the globe?

The Internet of Things (IoT) is a major trend at the moment. For example, we now have the technology where your fridge can detect that you’re out of milk and place an order for some. The capacity for devices to communicate with each other in the home is growing exponentially. IoT’s growth will play an important role as we move toward living in the homes and cities of the future.

5. Please brief us about the products/services/solutions you provide to your customers and how they get value out of it.

With Bitfinex Pay, we’ve created an intuitive and seamless way for online merchants to receive payments in crypto. Bitfinex Pay enables merchants to be easily equipped to support crypto payments as increasing numbers of consumers become more comfortable with paying for goods and services using digital tokens. Bitfinex also provides our customers with the capacity to lend out their crypto and take out a loan using cryptocurrency that can be converted into fiat.

Meanwhile, Bitfinex Securities, the securities platform of the exchange, is providing a better way to raise capital or list securities than traditional exchanges. As a leading exchange in the blockchain space, we are keen to support blockchain-related projects, but the platform is not in any way limited to doing so. Bitfinex Securities provides the opportunity for small companies to access funding markets when that route is not available to them in traditional finance. Through blockchain-based securities, it provides the capacity to significantly reduce listing costs and streamline processes. It also increases accessibility to securities products among our member base. The monumental leap that the blockchain represents is an opportunity for new entrants to compete against the incumbent stock exchanges.

6. What role has Bitfinex played in the innovations of new technologies?

At Bitfinex, we are extremely focused on innovation. You don’t have to look much further than our innovative product offerings. For example, Bitfinex Pay is disrupting the traditional payments industry. By furthering the use of crypto for transactions for payments and services, Bitfinex Pay is at the vanguard of a revolution in money and payments. We also have Bitfinex Securities, which is disrupting the traditional exchange trading model for securities. Unlike exchanges in traditional finance, which are more geared to catering to larger and more established corporations, Bitfinex Securities is set to become a hub for innovative tech startups. It recently helped raise €6.75 million through the issuance of the Blockstream Mining Note. This is unprecedented, and Bitfinex Securities offers capital raising and trading opportunities to a wide range of companies that would not otherwise be able to access traditional capital markets.

7. The industry is seeing the rising importance of Big Data Analytics and AI. How do you see these emerging technologies impact the business sector?

The rising importance of Big Data Analytics and AI will undoubtedly impact the business sector. The digital era has created an overwhelming amount of information, which has proven to be immensely valuable to businesses. An increased capacity to analyze large data sets more quickly and efficiently will not only improve business efficiency but also provide the valuable insights needed to make better decisions.

8.  What are some of the challenges faced by the industry today?

Exclusive Interview With Sandeep Mukherjee, Director, Fluent Commerce India

When you place orders for multiple products with an eCommerce platform, you likely end up spending more time tracking the orders than you would buying them at nearby stores, to the extent that you lose faith in the efficiency of the eCommerce website. Inventory management through a distributed order management system is what most of these eCommerce platforms require to overcome supply chain management challenges. Fluent Commerce India is a cloud-based order management system (OMS) that enables an omnichannel strategy for quick and foolproof delivery of goods, ensuring the best order management experience. Analytics Insight has engaged in an exclusive interview with Sandeep Mukherjee, Director, Fluent Commerce India.

1. Kindly brief us about the company, its specialization, and the services that your company offers.

2. What is your biggest USP that differentiates the company from competitors?

Our platform makes it seamless and convenient for retailers to change their fulfillment strategy, providing them with the ability to react quickly to changing circumstances, without costly IT intervention. This combined with our pre-built workflows and low code Order Management Experience (OMX) platform sets our services apart in the industry. Fluent Order Management allows merchants to be fast and agile.

3. Kindly mention some of the major challenges the company has faced till now.

Our biggest goal at the moment is education: explaining why fulfilling orders via an ERP or commerce platform isn’t enough anymore. Companies have been managing and fulfilling orders through their shopping, warehouse management, or ERP platforms. Our focus is currently on making customers aware of the importance of a distinct OMS layer, since there are two types of retail professionals: those who are aware that they need a distributed order management system and those who aren’t. Those who do understand have been managing order-fulfillment logic within ERP or warehouse management systems, and they find it difficult to move that logic out of ERP and warehouse management into our OMS layer. This is happening with most enterprise-level customers.

4. Please brief us about the products/services/solutions you provide to your customers and how they get value out of them.

At Fluent Commerce, we are constantly striving to provide our customers with comprehensive solutions for efficient and easy commerce. Our Cloud-native platform is designed for the future growth of modern architecture. We offer the following services.

Web apps: Customise the user experience to improve employee satisfaction and productivity.

Order Management Experience (OMX): A low-code order management platform that allows retailers to customize and modify UIs and workflows to match their needs.

APIs: Our customers can connect to any application or channel using GraphQL and REST APIs.

Orchestration Engine: Manage product, inventory, order, and payment updates across all systems.

Scalable architecture: Allows for on-demand scaling, worldwide availability, and zero-downtime releases.

Our system is seamlessly integrated with Adobe, Salesforce, and commerce tools for fluent and fast order management. By using our OMS, it has become easier for companies to improve their product cycle timelines, reduce errors in order fulfillment, and implement better inventory management techniques. Our system helps in tracking real-time inventory or fulfillment of orders and also offers alteration.

5. Tell us how your company is contributing to the IoT/AI/Cloud Computing industry of the nation and how the company is benefiting the clients.

Our platform is MACH certified: Microservices-based, API-first, Cloud-native, and Headless. The certification comes from the MACH Alliance, an independent industry body; when the Alliance certifies a platform, it indicates the platform is not only cloud-ready but cloud-born. That architecture brings scalability and zero-downtime availability, and it means we are fully capable of connecting with any cloud-based or in-house technology a brand has recently deployed. Because we are MACH certified, any brand that adopts Fluent is assured of interoperability, that is, the capacity of computer systems or software to exchange and use information with any other platform, such as an ERP, POS, or warehouse management system. This is how we contribute to interoperability across the whole cloud.

Our platform also operates on an event-driven model. Any interaction with Fluent OMS, at any level, is always recorded as an event. For example, assume a brand is using Fluent on an eCommerce website: when someone places an order, the information hits the Fluent platform, and that is recorded as an event. That first event then triggers several other events throughout the Fluent system. As a result, every piece of information is available on the Fluent platform, and once it has been captured, any AI engine can work on the data to generate insights. That is how we contribute to AI in the field of order management.
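The cascade described above, where every interaction is logged and one event triggers others, can be sketched with a minimal in-memory event bus. The event names and handlers below are illustrative, not the actual Fluent Commerce event model:

```python
from collections import defaultdict

class EventBus:
    """Toy event bus: records every event and fans out to subscribers."""
    def __init__(self):
        self.handlers = defaultdict(list)
        self.log = []  # every event is recorded, mirroring the OMS audit trail

    def subscribe(self, event, handler):
        self.handlers[event].append(handler)

    def emit(self, event, payload):
        self.log.append((event, payload))
        for handler in self.handlers[event]:
            handler(payload)

bus = EventBus()
# Placing an order triggers inventory reservation, which triggers fulfillment.
bus.subscribe("ORDER_CREATED", lambda p: bus.emit("INVENTORY_RESERVED", p))
bus.subscribe("INVENTORY_RESERVED", lambda p: bus.emit("FULFILMENT_CREATED", p))

bus.emit("ORDER_CREATED", {"order": "ORD-1001"})
print([name for name, _ in bus.log])
# → ['ORDER_CREATED', 'INVENTORY_RESERVED', 'FULFILMENT_CREATED']
```

Because every step lands in `bus.log`, any downstream analytics or AI engine can consume the full history, which is the property the answer above describes.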

6. The industry is seeing a rising importance of business and technology enablers like virtualization, convergence, and cloud. How do you see these emerging technologies impact your business sector?

In recent years, integrated SaaS platforms have emerged as the ideal one-stop solution for managing e-commerce activities on a single platform. Legacy practice involved coordinating multiple stakeholders to address interconnected tasks and problems, which was neither cost-effective nor time-efficient.

However, automated order management has ironed out the process, enabling a seamless post-purchase experience and allowing companies to deploy a single unified platform to manage an order from start to finish.

Our SaaS approach enables firms to manage both technology expenditures and the continual technological improvements that a fast-changing industry like e-commerce necessitates.

7. How is your company helping customers deliver relevant business outcomes through the adoption of the company’s technology innovations?

When a company adopts Fluent OMS, it benefits them in several ways, such as:

Accelerated sales growth - Fluent Commerce adds predictability to a brand, and in today's world every consumer desires a predictable brand. For example, if I am on a brand's website looking at product specifics, I want to know: if I purchase this particular article, what is the projected fulfillment time? When I know the exact delivery time of a product before purchasing it, my commitment to the brand grows. Two things then happen: first, there is a good chance I will buy that item, because I know when it will be available and how long it will take to arrive; second, having had a positive experience, I will return to the website. These correspond to two parameters, the Quick ratio and the conversion ratio, and Fluent Commerce accelerates sales by increasing both.

Increased operational margin - We help brands optimize operating margin by enabling them to deliver and fulfill each order in the most optimized manner, which means the quickest fulfillment at the lowest possible cost, since most merchants and brands spend a lot of money fulfilling each order. By helping our clients achieve optimized operating margins, we also improve the consumer experience.

8. What are your growth plans for the next 12 months?

From a target-market standpoint, we want to keep serving our specialized market, which is currently retail, and we will also begin catering to B2B businesses. To expand our company's footprint, we are investing heavily in product development, particularly in India. This is a major strength for Fluent Commerce: we launched in India in October last year and have already begun investing in product development there, which is a huge step forward in terms of growth.

9. Which industry verticals are you currently focusing on? And what is your go-to-market strategy for the same?

We currently work primarily with retailers and B2C brands, but we are also starting to talk to B2B businesses. Our go-to-market strategy is straightforward: wherever a brand's order fulfillment isn't as profitable or customer-oriented as it could be, we tackle the issue and provide a superior order management solution that helps them meet their consumers' expectations.

Exclusive Interview With Akshay Soam, Chief Technology Officer, Seracle

Blockchain, with its applications and development tools, is growing in popularity to unimaginable heights, attracting both fans and critics. It has become such a buzzword that even an ordinary company changing its name and business model to match the blockchain hype can see higher profits.

1. What was the business problem? How was it identified and by whom?

Founded in 2023, Seracle enables developers to build and create apps on blockchain very fast, with the full support of Seracle's blockchain experts. Seracle has quickly grown into a business worth more than US$100 million, employing over 100 staff with offices across India, Singapore, Thailand, and Canada. We work with a wide range of industries, including banking, agriculture, and travel, which are increasingly reliant on blockchain technology.

Over the last two years, during the pandemic, we saw a remarkable uptake in the number of people investing in crypto, as well as in the use of blockchain tech by other organizations, so keeping pace with this newfound demand was essential. This is where our technical team identified some significant issues. As an early-stage start-up, we built a lot of our crypto and Web 3.0 infrastructure on our own, using the services of a cloud provider that offered only basic capabilities. We knew we would need to handle and prioritize a wide variety of data types, so we chose MongoDB as our core database, which we managed ourselves.

But our team of developers quickly needed extra scalability to support the business's strong growth. We also needed to overcome the challenges of managing growing data complexity so we could continue to improve performance and reliability. While an average day would see about 3,000 users on the Seracle platform, this number could rise to between 12,000 and 18,000 new registrations per day following marketing campaigns, so it was important that our developers could scale up and down on demand to manage transaction volumes. The development team also had to continually make changes and release updates to keep up with the fast-evolving crypto landscape. All of this was possible with our existing infrastructure, but it required a large investment of developer time and capital.
We also had an important analytics and visualization challenge: getting a holistic, up-to-date picture of all our data was a pain. We were using Kibana and had to sync all our different data sources into Elasticsearch just to generate reports. This was time-consuming and created other problems, such as multiple copies of data and the risk that the data would be out of date by the time it was shared. The syncing issues often caused downtime and made queries inefficient. Finally, Seracle's initial data infrastructure didn't allow data sets from different projects, each with different privacy and security requirements based on country or region, to be aggregated in a single database. As more customers from a wide range of countries were onboarded, we needed extra security capabilities and built-in compliance for every conceivable geographic region, without compromising the customer experience.

2. How did you and your team settle on this solution to the business problem?

Once we had identified the problems, we started figuring out how to resolve them and make our system much more reliable and efficient. To support the scaling up of clients, we upgraded to MongoDB Atlas.

3. What was the build/creation process like? How long did it take? Were there any hiccups or surprises?

We recently migrated one of our marquee customers to Atlas; in all, the migration took about 40-50 days. The migration was a perfect opportunity to revisit our cloud layer and overcome the flexibility and scalability limitations of our previous cloud provider, and we also moved to AWS. The second stage of the migration was the move to MongoDB Charts for analytics. MongoDB handheld us through the entire build/creation process, which made the migration and implementation seamless. The team at MongoDB hosted one-on-one sessions with our technical team to understand our architecture better and suggested ways to improve our platform. Always-on support from people who know the database inside and out gave us a lot of added value. Developers can now focus on what matters most, driving innovation and supporting customers, instead of just managing the database and issues such as queries, sync, security, and compliance. Together, MongoDB Atlas and Charts are helping drive forward one of the most modern and exciting areas of technology, pushing the boundaries of what our platform can offer.
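One reason Charts removes the sync-to-Elasticsearch step is that it renders aggregation pipelines straight from Atlas. As a sketch, a dashboard panel summarizing trades per day might be backed by a pipeline like the one below; the "trades" collection and its field names are assumptions for illustration, and with pymongo it would run as `db.trades.aggregate(pipeline)`:

```python
# Hypothetical MongoDB aggregation pipeline: daily trade counts and volume.
# Stage 1 filters settled trades, stage 2 groups by calendar day,
# stage 3 sorts the buckets chronologically.
pipeline = [
    {"$match": {"status": "SETTLED"}},
    {"$group": {
        "_id": {"$dateToString": {"format": "%Y-%m-%d", "date": "$executedAt"}},
        "trades": {"$sum": 1},
        "volume": {"$sum": "$amount"},
    }},
    {"$sort": {"_id": 1}},
]
print(pipeline)
```

Because the same pipeline can back both a Charts dashboard and an ad-hoc report, there is a single source of truth instead of copies synced into a separate analytics store.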

4. What was the most challenging aspect for you (as CIO) and how did you address that challenge?

Working with data has always been the hardest part of building and evolving applications, and for us, performance and reliability are key. MongoDB has given us the flexibility to move fast and has simplified how we build with data for any application. With MongoDB's application data platform, our team can use one interface for any application and run it anywhere. This gives us resilience, nearly infinite scalability, and best-in-class security, with multi-cloud deployment flexibility. MongoDB Atlas has helped us immensely with operational tooling and management while freeing up developer time. Another important aspect is MongoDB Charts, which has added huge value by enabling better, easier analytics, with everything in one place.

5. What were the results? Has the project transformed your business in unexpected ways? Has it led to identifying additional opportunities?

We now have a more efficient trading mechanism and can go beyond 15,000 trades per second with no trouble. This translates directly into better performance and cost savings for Seracle's clients: customers can make more money because they can process more transactions. The analytics element also enables more targeted products and a better understanding of customer needs. Compared to Kibana, MongoDB Charts can handle a large, and exponentially growing, number of data sets more easily and quickly. Atlas on AWS gives us extra flexibility and scalability, meaning the company can scale up or down in response to promotional activity in a particular region without any issues or downtime.

To tech leaders I would say: if you're looking for crypto or blockchain services, there are world-class solutions out there that can radically transform how you do business. Just call me! Working with MongoDB helps ensure we can offer industry-leading reliability, robustness, and security. It makes more sense to plug into an already-built solution like Seracle than to build the whole tech stack from the ground up; companies can reduce costs and save precious go-to-market time by using an existing solution. Doing from scratch what we have done, building your own infrastructure, would take at least three years and over 40 developers. Thanks to MongoDB we were able to build much faster and with fewer developers. We plan to onboard more customers in 2023 and will continue relying on MongoDB to power our growth and sustain our innovation efforts in a fast-growing industry.

Exclusive Interview With Rakesh Goyal, Director, Probus Insurance Broker

Probus Insurance Broker Private Limited is one of the leading tech-enabled insurance brokers in India. With a presence across all states and Union Territories of India, covering more than 400 cities, the company offers insurance products from India's top insurance companies. Probus operates through a network of more than 25,000 Point of Sale Persons (PoSPs) and also focuses on Tier 2 and Tier 3 cities. Analytics Insight engaged in an exclusive interview with Rakesh Goyal, Director, Probus Insurance Broker.

Q1. Kindly brief us about the company, its specialization, and the services that your company offers. 

Probus Insurance Broker Private Limited is one of the leading tech-enabled insurance brokers in India. With a presence across all states and Union Territories of India, covering more than 400 cities, the company offers insurance products from India's top insurance companies. Probus operates through a network of more than 25,000 Point of Sale Persons (PoSPs) and also focuses on Tier 2 and Tier 3 cities. With over 15 regional offices and a workforce spread across India, Probus has a unique technology-based online and offline hybrid business model, along with a fleet of dedicated on-ground PoSPs across the country who are supported and serviced by teams at the regional level. Enriched with a digital presence through an online platform, the company enables the team and PoSPs to access all details under a single login, ensuring a seamless solicitation process. Both the mobile application and the online portal can be used not only to solicit business but also for pre- and post-sales queries. A well-balanced mix of AI and human resources further ensures a superior customer experience. Probus Insurance Broker thus optimizes the customer journey through an effective amalgamation of digital experience and a human touch at each stage.

Q2. With what mission and objectives was the company set up? In short, tell us about your journey since the inception of the company.

Our Director, Rakesh Goyal, has been an avid reader since childhood. His curiosity to learn new things and analyze everything in detail led him to discover the reality of the insurance market, and it struck him that insurance's actual benefits were unknown to society. So his primary objective was to build an insurance company focused on customer-centric transparency that provides the necessary assistance, and this became the turning point for starting Probus. The company's primary goal has therefore always been to create a proper distribution structure, educate the public about the right products, and support them in claims and other related activities.

Q3. Kindly mention some of the major challenges the company has faced till now. 

The gradual transition from a traditional business model to the digital route was one of the major challenges the company faced. The digital transformation journey involved many processes: setting up the required tech systems, incorporating the right tools and software, training the different stakeholders to use those tools, finding the right resources, and handling other technological integration aspects, so it was quite a daunting process at the time. Another major challenge was seamlessly handling the renewal process, from underwriting and onboarding to policyholder services, claims processing, sending renewal notices, and providing necessary assistance to partners and customers. However, through the optimal implementation of a Robotic Process Automation (RPA) system, the company was able to deploy new efficiencies across the organization in the most effective manner.
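One routine task the answer above mentions, sending renewal notices, illustrates the kind of rule an RPA bot can run on a schedule. The sketch below flags policies expiring within a notice window; the data model and the 30-day window are assumptions for illustration, not Probus's actual implementation:

```python
import datetime

def due_for_renewal(policies, today, window_days=30):
    """Return IDs of policies whose expiry falls within the notice window."""
    cutoff = today + datetime.timedelta(days=window_days)
    return [p["id"] for p in policies if today <= p["expires"] <= cutoff]

# Hypothetical policy records; a real bot would pull these from the
# policy administration system and then queue notices for each hit.
policies = [
    {"id": "POL-1", "expires": datetime.date(2024, 3, 10)},
    {"id": "POL-2", "expires": datetime.date(2024, 6, 1)},
]
print(due_for_renewal(policies, datetime.date(2024, 3, 1)))
# → ['POL-1']
```

Automating this check removes a manual, error-prone step from the renewal cycle, which is the efficiency the RPA deployment is credited with.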

Q4. How do you see the company and the industry in the future ahead?

Since innovation is a must today, we believe in creating an insurance portfolio for all users, including existing ones, bringing them under one roof. This will eventually automate customer responses and save the time spent searching for policies. We know that customers' requirements keep changing, so we need to stay ahead by building foresight into upcoming needs. Technology will therefore be our most significant tool for staying relevant to changing demands, and our portals will become more interactive, adding value to the customer's overall experience. Another major focus will be protecting customer data and giving customers satisfactory ownership of their policies. We also aim to create the necessary awareness, in both rural and urban areas, and change people's perspective and approach toward insurance.

Q5.  What are your growth plans for the next 12 months?

Improving product offerings by enhancing the product basket, strengthening the distribution channel through digital distribution, and improving operational efficiency will be our primary focus in the coming year.
