Data Architect

Cyient is a global engineering and technology solutions company. As a Design, Build, and Maintain partner for leading organizations worldwide, we take solution ownership across the value chain to help clients focus on their core business, innovate, and stay ahead of the curve. We leverage digital technologies, advanced analytics capabilities, and our domain knowledge and technical expertise to solve complex business problems.

With over 15,000 employees globally, we partner with clients to operate as part of their extended team in ways that best suit their organization’s culture and requirements. Our industry focus includes aerospace and defence, healthcare, telecommunications, rail transportation, semiconductor, geospatial, industrial, and energy.

Job Description

About the Role:

We are looking for a data architect/engineer to join us at our Hyderabad location and perform data architecture functions for digital solutions. You will support the design and development of how our data is architected, stored, and accessed to produce insights.

You Will:

  • Improve our data and data pipeline architecture, optimising data flow and collection for teams.
  • Bring 5-8+ years of experience as a Data Engineer.
  • Support software developers, database architects, data analysts, and data scientists on data projects, ensuring a consistent data delivery architecture across ongoing projects.
  • Create data pipeline architecture.
  • Gather business requirements, analyse business needs, and define the BI/DW architecture that supports and delivers technical solutions to complex business and technical requirements.
  • Work comfortably with team members at all levels throughout the business.
  • Harmonize our data architecture between Power BI Report Server and Power BI Cloud.
  • Architect, develop, and test scalable data warehouses and data pipelines.
  • Design and develop scalable ETL processes, including error handling.
  • Create databases and advanced data models, including star and snowflake schemas.
  • Write scripts for stored procedures, database snapshot backups, and data archiving.
  • Prepare data structures for advanced analytics and self-service reporting.
  • Be conversant in Agile development methodology using an ALM tool such as Jira.

You’ll Need:

  • Experience with DBMSs such as MS SQL Server, MySQL, PostgreSQL, DynamoDB, Cassandra, and MongoDB.
  • 5+ years’ experience with MS SQL and with OLTP and OLAP database models.
  • Experience migrating on-premises data warehouses/data marts to modern cloud data warehousing.
  • Strong knowledge of Power Query and DAX, plus data visualization skills (Power BI/Tableau).
  • Understanding of database performance and the impact of tuning.
  • Extensive experience designing and developing large enterprise data warehouses and reporting systems for a large-scale BI user community.
  • Good exposure to Azure tools and services centered on data and analytics (Data Lake, Data Factory, Azure Databricks, Azure SQL).
  • Understanding of data security on Microsoft Azure, including DR and backup mechanisms on Azure.
  • Comfortable interacting with stakeholders at all levels throughout the business.
  • Proficiency in the development and administration of back-end data integration and architecture, including dimensional data modelling, database design, data warehousing, ETL development, and performance tuning, as well as front-end reporting and analytics platforms, including OLAP cube design, tabular data modelling, Power Pivot, Power View, and Power BI report and dashboard development.
  • Advanced data modelling, including star and snowflake schemas.
  • Writing scripts for stored procedures, database snapshot backups, and data archiving.
  • Preparing data structures for advanced analytics and self-service reporting.
  • Big Data engineering and cloud technologies: Elastic Stack, AWS, Google Cloud, Apache stack.
  • Expertise in query and programming languages: MS SQL Server/T-SQL, PostgreSQL, MySQL, Python, R.
  • Skilled in developing a wide range of APIs.
  • Good to have: experience with BW warehousing and planning (RSPLAN, FOX, ABAP functions).

Secondary Skills:

  • Experience with data pipeline and workflow management tools: Azure Monitor, Azkaban, Luigi, and Airflow.
  • Experience with stream-processing systems such as Spark Streaming.
  • Reporting tools: Power BI, Tableau.

Good to have:

  • Experience with AWS cloud services: EC2, EMR, RDS, or Redshift.
  • Understanding of messaging infrastructure such as Event Hubs, Service Bus, and Notification Hubs.
  • Experience with stream-processing systems such as Storm.
  • Bachelor’s/Master’s degree in Computer Science, Mathematics, Operations Research, or Statistics.
  • 5 or more years of experience in a related field.

You will report to the Technical Lead for Data Engineering and Analytics.

Skills & Experience: Big Data Architecture, Big Data Processing, Data Engineering

Cyient is an Equal Opportunity Employer.

Cyient recruits, employs, trains, compensates, and promotes regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender, gender identity or expression, veteran status, and other protected status as required by applicable law. We are proud to be a diverse and inclusive company where our people can focus their whole self on solving problems that matter.
