Dastgyr is on a mission to simplify the inefficient retail supply chain by connecting retailers with suppliers through technology. It empowers the often underserved areas of the retail landscape with extensive inventory choices, transparent pricing, and next-day doorstep delivery, all within a few clicks on a smartphone. Dastgyr aims to positively impact the country’s economy by helping offline grocery retailers, who contribute significantly to Pakistan's GDP.
About the role:
The Data Engineer will support our software developers and database architects on data initiatives and ensure that an optimal data delivery architecture is applied consistently across ongoing projects. The incumbent will be self-directed and comfortable supporting the data needs of multiple teams, systems, and products, and will be expected to design and optimize Dastgyr’s data architecture to support the next generation of products and data initiatives.
Responsibilities:
- Analyze and organize raw data,
- Build data systems and pipelines,
- Evaluate business needs and objectives,
- Interpret trends and patterns,
- Conduct complex data analysis and report on results,
- Prepare data for prescriptive and predictive modeling,
- Build algorithms and prototypes,
- Combine raw information from different sources,
- Explore ways to enhance data quality and reliability,
- Identify opportunities for data acquisition,
- Develop analytical tools and programs,
- Collaborate with data scientists and architects on several projects,
- Use data to discover tasks that can be automated,
- Deliver updates to stakeholders based on analytics.
Requirements:
- Advanced working knowledge of SQL, including query authoring, and experience with a variety of relational databases,
- Experience with data warehouses such as BigQuery and Snowflake,
- 3+ years of experience with Python, and data visualization/exploration tools such as Metabase and Tableau,
- Familiarity with the AWS ecosystem, including RDS and Redshift,
- Excellent communication skills with expertise in explaining technical concepts to non-technical audiences,
- Comfort working in a dynamic, research-oriented team with many concurrent projects,
- Experience with data pipeline and workflow management tools such as Azkaban, Luigi, and Airflow.
Good to have:
- Experience with Big Data technologies such as Hadoop and Kafka,
- Familiarity with ML frameworks and libraries such as TensorFlow, Spark, PyTorch, and mlpack.
What We Offer:
- The opportunity to work alongside an exceptionally talented team,
- An exciting and fast-paced business directly impacting Pakistan's economic landscape,
- Market competitive remuneration,
- Flat hierarchies, with short and open channels of communication,
- Unlimited paid leave,
- Employee stock options,
- Training budget to help upgrade skills,
- Performance-based bonuses,
- A meritocratic and rewarding work culture built around core values of collaboration, growth, and execution.