Location : Pune, India
iCEDQ is an industry leading DataOps Platform for Data Testing and Production Data Monitoring in ETL, Data Warehouse, and Big Data systems.
- Develop and maintain the core engine of the product (iCEDQ). The engine is the bedrock of the product; it is written in core Java and uses various frameworks to achieve high performance and scalability.
- Understand and implement connectivity to databases, file formats, cloud data warehouses, cloud Apps, and cloud APIs.
- Work with the Architect to design and develop engine features within stated schedules, and ensure the engine continually improves in performance and scalability on production workloads.
- Come up with optimal designs to ensure that the engine stays true to its original goals.
- Independently perform root cause analysis (RCA) on reported problems and provide optimal resolutions.
- Continually work with performance engineering to ensure that the engine meets or exceeds stated performance and scalability goals, with an emphasis on high-volume data processing.
- Deliver the engine to QA in such a form that it is always testable via automation testing frameworks.
- Ensure that the engine's codebase meets stated benchmarks, verified with both static and dynamic analysis tools.
- Work with the Architect to implement and enforce design and code quality throughout the development team, via design and code reviews as well as ensuring that the overall code delivery pipelines are functioning well (CI/CD).
Between 10 and 15 years of hands-on technical experience. This is a purely technical role, and being hands-on in core Java is a mandatory requirement. Must have delivered a solution or product written in core Java in a lead capacity (i.e., leading and owning a reasonably large portion of the product or solution). Having been a team lead is not mandatory.
Must-have skills:
Core Java, with comfort using SQL as and when required. Must be able to balance time to market with technical mandates, and hence have excellent scoping and estimation skills. Should be comfortable working in new technical areas (with the Architect) and delivering to stated goals. Must have stellar debugging skills and be familiar with both Windows and Linux from a programming and OS-knowledge perspective.
Expert in SQL, Apache Spark, and Akka (Java); strong knowledge of RDBMS and file formats (XML, JSON, Avro, Parquet, ORC, etc.); experience with NoSQL databases and cloud platforms (AWS, Azure, GCP).
Nice to have:
Hands-on experience with any other programming language (Python, C++, or other) is a definite plus. Awareness of scripting languages like Groovy. Understanding of streaming technologies (Kafka). Knowledge of ETL, data pipelines, and data warehousing.