The Good Tech Companies - Real-Time Data Pipelines Reshape Brick-and-Mortar Retail Through Edge Computing Integration
Episode Date: September 11, 2025
This story was originally published on HackerNoon at: https://hackernoon.com/real-time-data-pipelines-reshape-brick-and-mortar-retail-through-edge-computing-integration. ... Suhas Hanumanthaiah transforms retail with real-time data pipelines, edge computing, and SAP-cloud integration to cut costs and boost efficiency. Check more stories related to cloud at: https://hackernoon.com/c/cloud. You can also check exclusive content about #real-time-data-pipelines, #edge-computing-retail, #sap-to-cloud-migration, #google-bigquery-optimization, #ecommerce-data-integration, #grocery-outlet-inc., #retail-digital-transformation, #good-company, and more. This story was written by: @kashvipandey. Learn more about this writer by checking @kashvipandey's about page, and for more stories, please visit hackernoon.com. At Grocery Outlet Inc., Suhas Hanumanthaiah modernized data systems by integrating real-time pipelines, SAP-to-cloud migrations, and eCommerce platforms. His ELT framework cut $10K in BigQuery costs, optimized 600+ pipelines, and improved analytics speed by 15%. These innovations boosted efficiency, supported digital growth, and redefined brick-and-mortar retail through edge computing.
Transcript
This audio is presented by Hacker Noon, where anyone can learn anything about any technology.
Real-time data pipelines reshape brick and mortar retail through edge computing integration.
By Kashvi Pandey.
In the evolving landscape of technology, the integration of real-time data pipelines and edge computing
is reshaping how brick and mortar stores operate.
Engaging actively with this gradual evolution is Suhas Hanumanthaiah,
a data architect who has worked to improve the operational efficiency of Grocery Outlet
Inc. During his tenure at Grocery Outlet, Hanumanthaiah played an important role in managing the
real-time pipeline of sales data from stores to the cloud data warehouse, AWS Redshift.
He also contributed to the digital transformation project that migrated AWS Redshift data to a Google
BigQuery lakehouse. This shift involved optimizing queries to reduce per-terabyte scan costs.
Further, a significant aspect of Hanumanthaiah's work in digital transformation involved
integrating SAP systems into the new data architecture. He meticulously mapped data from SAP HANA to over
300 data warehouse tables, ensuring that the transition from legacy systems like the AS/400 to
SAP-based systems did not disrupt the flow of critical business information. One of the notable
achievements in this process was the development of an ELT (extract, load, transform) framework
tailored for the data warehouse migration. This framework facilitated the efficient handling of change data
capture (CDC) from SAP, a method essential for real-time data replication. By optimizing the CDC
process, Hanumanthaiah achieved a substantial cost saving of $10,000 in Google BigQuery billing.
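The transcript does not include the implementation, but the core idea behind a CDC redesign that cuts warehouse scan costs can be sketched: compact the raw change-event stream down to the latest state per primary key before loading, so the warehouse merges far fewer rows. A minimal illustration in Python, where the event shape and field names are hypothetical, not from the source:

```python
from typing import Iterable

def compact_cdc(events: Iterable[dict]) -> list[dict]:
    """Collapse a CDC event stream to the latest change per primary key.

    Each event is assumed to look like:
      {"key": <primary key>, "op": "UPSERT" | "DELETE",
       "ts": <increasing sequence number>, "row": {...} or None}
    Loading only the compacted output (instead of every raw event)
    shrinks the data the warehouse has to scan and merge.
    """
    latest: dict = {}
    for ev in events:
        cur = latest.get(ev["key"])
        if cur is None or ev["ts"] >= cur["ts"]:
            latest[ev["key"]] = ev
    # Keys whose final state is a delete are dropped; surviving rows are emitted.
    return [ev["row"] for ev in latest.values() if ev["op"] == "UPSERT"]

events = [
    {"key": 1, "op": "UPSERT", "ts": 1, "row": {"id": 1, "qty": 5}},
    {"key": 1, "op": "UPSERT", "ts": 3, "row": {"id": 1, "qty": 7}},
    {"key": 2, "op": "UPSERT", "ts": 2, "row": {"id": 2, "qty": 1}},
    {"key": 2, "op": "DELETE", "ts": 4, "row": None},
]
print(compact_cdc(events))  # [{'id': 1, 'qty': 7}]
```

In a real pipeline the compacted batch would then feed a single MERGE into the target table, rather than replaying every intermediate change.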
The saving was accomplished by redesigning how CDC data was processed, reducing unnecessary data scans, and
improving query performance. In addition to cost savings, Hanumanthaiah addressed the challenges
posed by the highly normalized SAP HANA data model. Complex calculation queries were prone
to timeouts in the middleware layer, affecting the performance of critical applications. Through persistent
optimization efforts, he enhanced query performance. Hanumanthaiah's expertise also extended to
e-commerce data integration. He modeled data from SAP to meet the requirements of e-commerce vendors
such as DoorDash, Instacart, Uber Eats, and the grocery outlet app. This integration was vital for
expanding the company's digital footprint and ensuring that online platforms had access to
accurate and timely data. Speaking of integration, he also developed a repeatable ELT framework for
the CDAP pipeline and accelerated the development of pipelines to support 600 plus tables in
less than eight weeks. Furthermore, he was instrumental in building high impact reports, such as sales
margin analyses, leveraging the new source systems, and developing a framework to track if
SAP SLT, an ELT tool from SAP, fails to replicate data into Google Cloud.
This included defining standard operating procedures (SOPs) and disaster recovery plans.
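A framework that tracks whether SLT-style replication has stalled can be as simple as comparing each table's last successful load time against an SLA and flagging laggards for the on-call SOP. A hedged sketch — the table names and thresholds below are made up for illustration:

```python
from datetime import datetime, timedelta

def find_stale_tables(last_loaded: dict[str, datetime],
                      now: datetime,
                      sla: timedelta) -> list[str]:
    """Return tables whose most recent replication is older than the SLA.

    In a real deployment, last_loaded would come from pipeline metadata
    (e.g. a load-audit table in the warehouse); stale tables would trigger
    the alerting and disaster-recovery steps defined in the SOP.
    """
    return sorted(t for t, ts in last_loaded.items() if now - ts > sla)

now = datetime(2025, 1, 1, 12, 0)
loads = {
    "sales_orders": datetime(2025, 1, 1, 11, 55),   # fresh
    "inventory":    datetime(2025, 1, 1, 9, 0),     # stale
    "pricing":      datetime(2024, 12, 31, 23, 0),  # stale
}
print(find_stale_tables(loads, now, sla=timedelta(hours=1)))
# ['inventory', 'pricing']
```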
He also developed an e-commerce interface at a fast pace with key data quality checks and
supported business continuity. Beyond technical implementations, Hanumanthaya collaborated with
technical vendors to optimize the usage of cloud resources. He fine-tuned MicroStrategy
VLDB (very large database) settings for AWS Redshift, achieving a 15% improvement in query execution
times. These enhancements contributed to more responsive analytics and reporting capabilities.
Reflecting on the challenges faced during these projects, Hanumanthaiah emphasizes the importance
of building connectors capable of managing CDC when replicating data from SAP HANA to cloud-based
data warehouses or lakehouses. He suggests that, in the absence of such connectors, employing
an intermediary transactional cloud database can be effective. This approach allows real-time applications
to rely on the cloud transactional system, while the lakehouse serves as a repository for
aggregated business reporting. Hanumanthaiah also notes that constructing real-time pipelines
can become complex when dealing with high-velocity data. He advocates for thorough testing of
connectors and interface technologies under simulated large loads to ensure reliability.
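Testing connectors under simulated large loads, as recommended above, mostly means generating synthetic change events at volume and checking throughput and failure counts. A minimal sketch, assuming any callable that ingests one event can stand in for the connector under test:

```python
import time

def simulate_load(connector, num_events: int) -> dict:
    """Push synthetic change events through a connector and report stats.

    `connector` is any callable that ingests one event. Throughput and
    failure counts are usually enough to spot a connector that buckles
    under high-velocity data before it reaches production.
    """
    failures = 0
    start = time.perf_counter()
    for i in range(num_events):
        try:
            connector({"id": i, "payload": "x" * 64})
        except Exception:
            failures += 1
    elapsed = time.perf_counter() - start
    return {
        "events": num_events,
        "failures": failures,
        "events_per_sec": num_events / elapsed if elapsed > 0 else float("inf"),
    }

# A toy in-memory "connector" standing in for a real SAP-to-cloud link.
sink = []
stats = simulate_load(sink.append, 10_000)
print(stats["failures"], len(sink))  # 0 10000
```

A real harness would also vary batch sizes and inject faults, but even this shape catches connectors whose throughput collapses under volume.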
Additionally, he points out that replication technologies often lack native error tracing,
highlighting the need for strong monitoring and testing.
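When the replication tool itself offers no error tracing, a common workaround is to wrap each table's replication step so that failures are logged with context rather than surfacing as an anonymous error. A hypothetical sketch (the table name and replication function are illustrative only):

```python
import logging
from functools import wraps

log = logging.getLogger("replication")

def traced(table: str):
    """Decorator that attaches table context to replication failures.

    Replication tools without native error tracing often surface only a
    generic failure; logging the table name and re-raising lets monitoring
    aggregate errors per pipeline.
    """
    def decorate(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            try:
                return fn(*args, **kwargs)
            except Exception as exc:
                log.error("replication failed for %s: %s", table, exc)
                raise
        return wrapper
    return decorate

@traced("sales_orders")
def replicate_sales_orders(batch):
    if not batch:
        raise ValueError("empty batch")  # simulate a replication fault
    return len(batch)

print(replicate_sales_orders([{"id": 1}]))  # 1
```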
In summary, Suhas Hanumanthaiah's contributions to integrating real-time data pipelines and edge
computing in retail have led to operational improvements at Grocery Outlet Inc.
His work exemplifies how retail data architecture is undergoing a modern makeover, enhancing customer
experiences.
This story was distributed as a release by Kashvi Pandey under HackerNoon's business blogging
program.
Thank you for listening to this Hackernoon story, read by artificial intelligence.
Visit hackernoon.com to read, write, learn and publish.
