
Data Engineer 1

Tech
Bangalore
Job Description
About LoadShare Networks:

At LoadShare, we are building India's largest intra-city logistics marketplace. Founded in 2017, we are a Series C startup backed by a strong set of investors (Tiger Global, Matrix Partners, BII, Stellaris, BeeNext, Filter Capital) focused on building a profitable and impactful business. We work with all major enterprise clients covering the full spectrum of intra-city use cases: Quick Commerce (Blinkit, Zepto, BBNow, etc.), Food (Swiggy, Zomato, etc.), Hyperlocal (Apollo, Licious, etc.), Grocery (Reliance, Spencers, ONDC, etc.), eCommerce (Flipkart, Amazon, Meesho, ShipRocket, etc.) and Bike Taxi (Uber, Ola, etc.).

● Scale: We deliver over 500K orders per day at the last mile across food, e-commerce, grocery and more on a common tech platform, with coverage in over 500 towns nationwide and a fleet of over 20K riders.

● Strong Tech: We are in a unique position to cross-utilize riders across different earning opportunities, matching the demand peaks of various use cases throughout the day (e.g., grocery peaks at 6–8 am, while bike taxi peaks from 8 to 10 am). In addition, our supply chain SaaS platforms power over 1M shipments per day.

As a member of the Data team, you will:

· Analyze, design, develop and support analytics reports and dashboards, especially to measure, monitor and alert the business on operations and product metrics.
· Have an appetite to learn and the flexibility to pick up new technologies.
· Apply strong SQL development and maintenance techniques around data movement.
· Troubleshoot data-sanity issues in complex workflows, partnering with the tech and product teams.
· Create ad-hoc reports as requested in a timely and accurate manner, and assist senior leadership with various tasks as needed.
· Present findings and recommendations to business and non-technical stakeholders with effective storytelling.
· Partner with internal teams on in-house technology projects to drive innovation and process improvements.
· Build ETL pipelines.

 
Job Requirements
Job requirements and key responsibilities:

· Strong SQL skills.
· Around 1–2 years of coding in Python (web-scraping experience preferred).
· Experience with data modeling, data warehousing and building ETL pipelines.
· Partner with the business, product and tech teams to understand the business domain and create reports and dashboards that support business growth and optimisation.
· Experience writing and tuning SQL scripts.
· Experience with BI tools such as Zoho Analytics, Metabase, Redash, Tableau or Power BI.
· Collaborate with analytics and business teams to improve the data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organisation.
· Able to communicate effectively with all levels of management in a clear and professional manner, both verbally and in writing.