Master Data and Information Management
  • Data Lakes

    The growing role of big data and analytics poses a real challenge for enterprises. The exponential growth of information from diverse sources, in varying formats and of varying quality, complicates information management and utilization, and strains traditional data warehousing.

    With the increasing number of customer touch-points and a myriad of new sources of business data, enterprises are migrating data from traditional sources, web logs (unstructured data), and data warehouses to Hadoop clusters. Setting up data lakes is necessary to leverage data generated by legacy transactional systems as well as digital data from social media, in real time or near real time.
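
    As a minimal illustration of that migration, the sketch below (PySpark; the paths and log pattern are ours for illustration, not from any specific engagement) parses raw web-server logs and lands them in a Hadoop-backed data lake as queryable Parquet:

        import re
        from pyspark.sql import Row, SparkSession

        spark = SparkSession.builder.appName("weblog-ingest").getOrCreate()

        # Common Log Format: host, timestamp, request, status, bytes
        LOG_RE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "([^"]*)" (\d{3}) (\S+)')

        def parse(line):
            m = LOG_RE.match(line)
            if m is None:  # keep unparsed lines too: schema-on-read, nothing dropped
                return Row(host=None, ts=None, request=None, status=None, raw=line)
            host, ts, request, status, _size = m.groups()
            return Row(host=host, ts=ts, request=request, status=int(status), raw=line)

        raw = spark.sparkContext.textFile("hdfs:///landing/weblogs/*.log")
        df = spark.createDataFrame(raw.map(parse))

        # Partitioned Parquet keeps the lake cheap to store and fast to query
        df.write.mode("append").partitionBy("status").parquet("hdfs:///lake/weblogs/")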

    Our experts can help you to:
    • Create a reservoir for enterprise, social, and device information
    • Develop a scalable storage, compute, and access data layer
    • Build the agility to absorb data of any type and scale and to process it for different business needs
    • Enable data governance and enterprise-wide access controls
    • Reduce the time to access and locate data to accelerate preparation and reuse
    • Convert digital data noise into inspired insights
    • Manage your Master Data Management and Data Integration requirements effectively
    • Build crucial, customized data assets for smarter, faster, and more effective decision making
    • Data Architecture and Engineering
    • Data Quality Management
Master Data and Information Management Expert
Related Work
Fixed Price - Est. Time: 6 months,

Position: NiFi/Big Data Developer

Must Have Technical Skills: NiFi

Good to have Technical Skills: Python, ETL

Preferred Industry Experience: Telecom

·         Extensive experience with NiFi in setting up data pipelines.

·         Hands-on experience using controllers and processors to set up an ETL framework in Apache NiFi (see the sketch after this list).

·         Extensive experience in Python

·         Good understanding of Spark, Spark Streaming, and PySpark.

·         Good understanding of Big Data components
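
NiFi flows themselves are assembled in the UI from processors and controller services, but routine ETL work is often scripted against NiFi's REST API. A minimal sketch, assuming an unsecured NiFi instance at http://localhost:8080 (a secured cluster needs a token, and endpoint details can vary across NiFi versions):

    import requests

    NIFI = "http://localhost:8080/nifi-api"

    # "root" is the alias NiFi accepts for the top-level process group
    resp = requests.get(f"{NIFI}/process-groups/root/processors", timeout=10)
    resp.raise_for_status()

    # Inventory every processor in the flow with its current run state
    for proc in resp.json()["processors"]:
        comp = proc["component"]
        print(f'{comp["name"]:<30} {comp["type"]:<50} state={comp["state"]}')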

No. of Resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4 yrs – 5 yrs

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months  

Fixed Price - Est. Time: 12 months,

SAS Data Quality Consultant:

  • The Data Quality developer will be responsible for analyzing and understanding data sources and end-user requirements using SAS DQ.
  • Must be aware of Data Quality dimensions and their implementation in the SAS DQ tool (a language-neutral illustration follows this list).
  • No. of Positions: 2
  • Role & Responsibilities - Development and Testing
  • Experience - 3-6 years of experience
  • Location – Delhi NCR
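
SAS DQ rules are configured inside the tool itself; purely to make the dimensions concrete, here is a language-neutral pandas sketch (not SAS, and the customer table is hypothetical) that scores completeness, uniqueness, and validity:

    import pandas as pd

    df = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "email": ["a@x.com", None, "b@x.com", "not-an-email"],
    })

    completeness = df["email"].notna().mean()           # share of non-null values
    uniqueness = df["customer_id"].nunique() / len(df)  # share of distinct keys
    is_valid = df["email"].str.match(r"[^@\s]+@[^@\s]+\.[^@\s]+", na=False)
    validity = is_valid.mean()                          # share passing the rule

    print(f"completeness={completeness:.2f} "
          f"uniqueness={uniqueness:.2f} validity={validity:.2f}")
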
Fixed Price - Est. Time: 12 months,

We need a proficient PySpark developer with solid hands-on experience.

Fixed Price - Est. Time: 12 months,

Position: Azure PySpark

PySpark is the core requirement; Data Warehousing and Azure experience will be an added advantage.

·         Must have low-level design and development skills; should be able to design a solution for a given use case.

·         Agile delivery: must be able to demonstrate design and code on a daily basis.

·         Must be an experienced PySpark developer with Scala coding skills; PySpark is the primary skill.

·         Must have experience in designing job orchestration, sequencing, metadata design, audit trails, dynamic parameter passing, and error/exception handling (see the sketch after this list)

·         Good experience with unit testing, integration testing, and UAT support

·         Able to design and code reusable components and functions

·         Should be able to review design and code, and provide review comments with justification

·         Eagerness to learn and adopt new tools and technologies

·         Good to have: experience with DevOps and CI/CD
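
To make those expectations concrete, here is a minimal PySpark job skeleton (paths and parameter names are illustrative) showing dynamic parameter passing, a simple audit trail, and error/exception handling:

    import argparse
    import sys
    from datetime import datetime, timezone

    from pyspark.sql import SparkSession

    def main() -> int:
        parser = argparse.ArgumentParser()
        parser.add_argument("--run-date", required=True)       # dynamic parameter
        parser.add_argument("--source", default="hdfs:///in")  # set by orchestrator
        parser.add_argument("--target", default="hdfs:///out")
        args = parser.parse_args()

        spark = SparkSession.builder.appName(f"etl-{args.run_date}").getOrCreate()
        started = datetime.now(timezone.utc)
        try:
            df = spark.read.parquet(args.source)
            rows = df.count()                    # audit metric: rows read
            df.write.mode("overwrite").parquet(args.target)
            status = "SUCCESS"
        except Exception as exc:                 # error/exception handling
            rows, status = -1, f"FAILED: {exc}"
        finally:
            # Audit trail: one record per run (a real job would persist this)
            elapsed = datetime.now(timezone.utc) - started
            print(f"run={args.run_date} rows={rows} status={status} elapsed={elapsed}")
            spark.stop()
        return 0 if status == "SUCCESS" else 1

    if __name__ == "__main__":
        sys.exit(main())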

No. of Resources required: 1 to 2

Work location: Bangalore

Experience: 8 yrs – 9 yrs

Mobilization Period in weeks: 2 weeks

Fixed Price - Est. Time: 12 months,

Position:  SAP Technical Consultant

Must have Technical Skills: SAP XI/PI/PO, ABAP programming

Good to have Technical Skills: SAP user exits (BADIs/BAPIs), EDI technologies, ABAP Proxy, BAPI/RFC/IDoc/Web Services/IDoc enhancements, SAP AIF, and webMethods

Good to have Industry Experience: Manufacturing & supply chain planning projects Experience

Preferred Project Experience: 2 full life cycle implementations in SAP PI/PO and 2 full life cycle implementations in ABAP (at least one in S/4HANA)

Role & Responsibilities:

·         Define Integration Strategy and design principles for ERP Integration landscape

·         Responsible for the Technical delivery of the new interfaces and overall quality

·         Demonstrate extensive experience operating as a SAP PI/PO Integration Consultant

·         Provide evidence of at least 2 full life cycle implementations in ABAP (at least one in S/4HANA) and 2 full life cycle implementations in SAP PI/PO.

·         Extensive integration architecture experience in SAP XI/PI/PO, with full life cycle implementations in SAP XI/PI/PO (7.31 onwards)

·         Extensive knowledge of ABAP development around various processes and functionalities, with knowledge in the areas of SD/MM/SCM/PP

·         Excellent programming skills; well versed in S/4HANA coding standards and methods

·         Must have extensive experience working with various SAP-specific integration technologies (ABAP Proxy, BAPI/RFC/IDoc/Web Services/IDoc enhancements)

·         In-depth knowledge of SAP user exits (BADIs/BAPIs) and ABAP programming, as well as ALE and EDI technologies

·         Experience with the Enhancement Framework and object-oriented programming within SAP

·         Any experience with SAP AIF and webMethods would be an added advantage

No. of Resources required: 1 to 2

Work location: Mumbai/Bangalore

Qualification: BE, BTech, MBA, MCA; Computer Science background preferred

Experience: 4 yrs - 6 yrs

Mobilization Period in weeks: 3 to 4

Duration: 3-4 quarters.

Fixed Price - Est. Time: 6 months,

Position: PySpark Developer

Must Have Technical Skills: PySpark

Good to have Technical Skills: Spark, DWH

·         Hands-on experience in PySpark along with NumPy, lists, pandas, and graph plotting (see the sketch after this list).

·         Excellent knowledge of Python frameworks and Airflow/Luigi/Dagster/Apache Beam.

·         Python skills across data libraries and visualization libraries.

·         Good SQL knowledge.

·         Ability to perform multiple tasks in a continually changing environment.

·         Exposure to Hive/HBase/Redis.

·         Good knowledge of Data warehousing.

·         Experience reading from Parquet and other formats.

·         Hands-on work with Hive and HBase.
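
A minimal sketch of that stack (the input path and column names are illustrative): read Parquet with PySpark, hand a small aggregate to pandas/NumPy, and plot it:

    import matplotlib.pyplot as plt
    import numpy as np
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("parquet-explore").getOrCreate()

    df = spark.read.parquet("hdfs:///lake/events/")  # columnar, schema included
    daily = df.groupBy("event_date").count()

    pdf = daily.toPandas()  # small aggregate, safe to pull to the driver
    print("mean events/day:", np.mean(pdf["count"]))

    pdf.sort_values("event_date").plot(x="event_date", y="count", kind="line")
    plt.savefig("events_per_day.png")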


No. of Resources required: 3 to 4

Work location: Remote

Qualification: BTech

Experience: 4 yrs – 5 yrs

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months  

Fixed Price - Est. Time: 12 months,

Position: Azure Snowflake

  • Demonstrated ability to successfully complete multiple complex technical projects and to create the high-level design and architecture of a solution, including class, sequence, and deployment/infrastructure diagrams.
  • Take ownership of technical solutions, from a design and architecture perspective, for projects in the presales phase as well as ongoing projects.
  • Prior experience with application delivery using an Onshore/Offshore model
  • Experience with gathering end user requirements and writing technical documentation
  • Time management and multitasking skills to effectively meet deadlines under time-to-market pressure
  • Suggest innovative solutions based on new technologies and the latest trends to the sales team.
  • Review the architectural/technological solutions for ongoing projects and ensure the right choice of solution.
  • Work closely with the sales team and clients to understand their business, capture requirements, identify pain areas, propose an ideal solution accordingly, and win business.

Azure: Hands-on experience in ADF (Azure Data Factory); a short trigger sketch follows the list below.

  • Hands-on experience in Big Data and the Hadoop ecosystem
  • Exposure to Azure service categories such as PaaS components and IaaS subscriptions
  • Ability to design and develop ingestion and processing frameworks for ETL applications
  • Hands-on experience in PowerShell scripting and deployment on Azure
  • Experience in performance tuning and memory configuration
  • Should be adaptable to learning and working on new technologies
  • Should have good written and spoken communication skills
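
As a minimal illustration of scripting against ADF (the subscription, resource group, factory, and pipeline names below are placeholders), triggering a pipeline run with the azure-identity and azure-mgmt-datafactory packages looks roughly like this:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
    RESOURCE_GROUP = "my-rg"               # placeholder
    FACTORY_NAME = "my-adf"                # placeholder

    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    # Parameters here feed the pipeline's own parameter definitions
    run = client.pipelines.create_run(
        RESOURCE_GROUP, FACTORY_NAME, "ingest_daily",  # pipeline name: placeholder
        parameters={"run_date": "2024-01-01"},
    )
    print("started run:", run.run_id)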

No. of Resources required: 1 to 2

Work location: Remote for now (Bangalore)

Experience: 5 yrs – 6 yrs

Mobilization Period in weeks: 2 weeks

Duration: 6 to 12 months 

Fixed Price - Est. Time: 12 months,

SAS Data Integration Consultant:

  • Graduate or postgraduate in Engineering, or a postgraduate in a non-engineering discipline
  • Certification in Base SAS/Advanced SAS will be an added advantage
  • SAS Base
  • SAS DI
  • Good basic knowledge of Base and Advanced SAS
  • No. of Positions: 3
  • Role & Responsibilities - Development and Testing
  • Experience - 3-6 years of experience
  • Location – Delhi NCR
Fixed Price - Est. Time: 12 months,

Position: Support Engineer

Must Have Technical Skills: Azure Data Factory

Good to have Technical Skills: Knowledge of SQL or Hive, Python, PySpark, HDFS or ADLS

Preferred Industry Experience: Manufacturing

Role:

Monitor batch pipelines in the Data Analytical Platform and provide workarounds for problems (see the monitoring sketch after the qualifications list below).

Troubleshoot issues, provide workarounds, and determine root cause upon failures.

Desired Qualification:

·         0-3 years of data engineering (DE) or batch support experience.

·         Must be willing to work entirely on the night shift.

·         Ability to work on a task independently or with minimal supervision.

·         Knowledge of a data orchestration tool, preferably ADF.

·         Knowledge of SQL or Hive, Python, PySpark, HDFS or ADLS.

·         Work location: Remote.
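
A minimal sketch of the monitoring side (names are placeholders): list the last day's failed ADF pipeline runs with azure-mgmt-datafactory, the kind of check a batch-support engineer automates:

    from datetime import datetime, timedelta, timezone

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import RunFilterParameters

    SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
    RESOURCE_GROUP = "my-rg"               # placeholder
    FACTORY_NAME = "my-adf"                # placeholder

    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    now = datetime.now(timezone.utc)
    filters = RunFilterParameters(last_updated_after=now - timedelta(days=1),
                                  last_updated_before=now)

    # Query runs in the window, then surface the failures for triage
    runs = client.pipeline_runs.query_by_factory(RESOURCE_GROUP, FACTORY_NAME, filters)
    for run in runs.value:
        if run.status == "Failed":
            print(run.pipeline_name, run.run_id, run.message)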

No. of Resources required: 1 to 2

Work location: Remote

Qualification: BE, BTech, MBA, MCA Prefer Comp. Sci. background

Experience: 0-3 Years

Mobilization Period in weeks: 1 week

Duration: 6 to 12 months  

Fixed Price - Est. Time: 6 months,

Position: Hadoop Admin

Must Have Technical Skills: Hadoop Admin

Good to have Technical Skills: Linux Admin, ETL

·         Extensive experience with Red Hat Linux and Cloudera is mandatory.

·         Experience in installing, configuring, upgrading, and managing Hadoop environments.

·         Responsible for deployments, and for monitoring capacity and performance and troubleshooting issues (see the health-check sketch after this list).

·         Work closely with data scientists and data engineers to ensure the smooth operation of the platform.

·         End-to-end performance tuning of the clusters.

·         Maintains and administers computing environments, including computer hardware, systems software, applications software, and all configurations.

·         Defines procedures for monitoring; evaluates, diagnoses, and establishes work plans to resolve system issues.

·         Working knowledge of the entire Hadoop ecosystem: HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.

·         Knowledge of private and public cloud computing and virtualization platforms.
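
A minimal sketch of one such health check (host, port, and thresholds are illustrative; the NameNode web/JMX port is 9870 on Hadoop 3.x, 50070 on 2.x): polling HDFS capacity and missing-block counts from the NameNode's JMX endpoint:

    import requests

    NAMENODE = "http://namenode:9870"  # placeholder host

    resp = requests.get(f"{NAMENODE}/jmx",
                        params={"qry": "Hadoop:service=NameNode,name=FSNamesystem"},
                        timeout=10)
    resp.raise_for_status()
    fs = resp.json()["beans"][0]

    used_pct = 100.0 * fs["CapacityUsed"] / fs["CapacityTotal"]
    print(f"HDFS used: {used_pct:.1f}%  missing blocks: {fs['MissingBlocks']}")
    if used_pct > 80 or fs["MissingBlocks"] > 0:
        print("WARNING: cluster needs attention")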

No. of Resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4 yrs – 5 yrs

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months