• Data Architect
  • India

    Full-time / Product Development
    We are looking for a Data Architect to be part of the product engineering team. The candidate should have over 12 years of experience with technologies such as AWS S3, Redshift, IoT, Python/Java, Lambda, DynamoDB, Glue, Athena, QuickSight, API Gateway, RDS, ALB, Kinesis, and Managed Kafka.

    We build breakthrough software products that power digital businesses. We are an innovative product development partner whose solutions drive rapid revenue, market share, and customer growth for industry leaders in Software and SaaS, Media and Publishing, Information Services, and Retail. Our key differentiator is our Product Mindset. Our development teams focus on building for outcomes and all of our team members around the globe are trained on the Product Mindset’s core values – Minimize Time to Value, Solve For Need, and Excel at Change. Our teams apply this mindset to build digital products that are customer-facing and revenue-generating. Our business-minded approach to agile development ensures that we align to client goals from the earliest conceptual stages through market launch and beyond.

    In 2021, 3Pillar Global India was named a “Great Place to Work” for the fifth year in a row, based on how our employees feel about our company, collaborative culture, and work/life balance. Come join our growing team!

    Roles & Responsibilities

    • Discovering different system components
    • Creating POCs and consulting with stakeholders
    • Creating high- and low-level architecture (enterprise-wide integration)
    • Proposing solutions to enterprise business problems specific to data and target-state design, applying industry best practices and deep platform-specific expertise
    • Designing data lakes & data warehouses
    • Defining data pipelines, transformation, consumption, visualization, and storage for historical, real-time, and near-real-time incoming data
    • Continually reassessing the current state for alignment with architecture goals and best practices
    • Database design & modeling
    • Building Python-based ETL processes
    • Logging and instrumentation of pipelines and services
    • NFRs: diagnosing, troubleshooting, and prototyping for performance, security, reliability, reusable components, and scalability; debugging & monitoring skills
    • Applying or recommending best practices in architecture, coding, API integration, and CI/CD pipelines
    • Solving problems at the architectural level
    • Flexibility in working with US-based teams and providing the needed time overlap

    Technical Competencies

    • AWS S3, Redshift, IoT, Lambda, DynamoDB, Glue, Athena, QuickSight, API Gateway, RDS, ALB, Kinesis, Managed Kafka
    • SQL & NoSQL databases, Postgres file catalog
    • Python-based workflows and SDKs, PySpark
    • Ability to write ETL in Python / Java / Scala
    • Hadoop framework, Spark, Storm, Snowflake
    • MDM (Master Data Management)


    Benefits
    • A competitive annual salary based on experience and market demands
    • Work from Anywhere Policy
    • Flexi-timings
    • Medical insurance with the option to purchase a premium plan or HSA option for your entire family
    • In-house Food & Refreshments
    • Regular Health check-up camps arranged by the company
    • Recreational activities
    • Business casual atmosphere