Master Data and Information Management
  • Data Lakes

    The growing role of big data and analytics poses a significant challenge for enterprises. The exponential growth of information from diverse sources, in varying formats and of varying quality, complicates information management, information utilization, and data warehousing itself.

    With the increasing number of customer touch-points and a myriad of new sources of business data, enterprises are migrating data from traditional sources, web logs (unstructured data), and data warehouses to Hadoop clusters. Data lakes are needed to leverage data generated by legacy transactional systems as well as digital data from social media in real time or near real time.

    Our experts can help you to:
    • Create a reservoir for enterprise, social, and device information
    • Develop a scalable storage, compute, and access data layer
    • Build the agility to absorb data of any type or scale and process it for different business needs
    • Enable data governance and enterprise-wide access controls
    • Reduce the time to access and locate data to accelerate preparation and reuse
    • Convert digital data noise into inspired insights
    • Manage your Master Data Management and Data Integration requirements effectively
    • Help you build crucial and customized data assets to achieve smarter, faster, and more effective decision-making
    • Data Architecture and Engineering
    • Data Quality Management
Master Data and Information Management Expert
Related Work
Fixed Price - Est. Time: 12 months,

Position: Support Engineer

Must Have Technical Skills: Azure Data Factory

Good to have Technical Skills: Knowledge of SQL or Hive, Python, PySpark, HDFS or ADLS

Preferred Industry Experience: Manufacturing

 Role:

Monitor batch pipelines in the Data Analytical Platform and provide workarounds for problems.

Upon failures, troubleshoot issues, provide workarounds, and determine root cause.
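The monitor-and-workaround loop described above can be sketched in plain Python. All pipeline names and statuses below are hypothetical; a real implementation for this role would query the Azure Data Factory monitoring APIs and trigger reruns through them instead:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("batch-monitor")

def monitor_pipelines(runs, rerun):
    """Check a batch of pipeline runs; rerun failures and flag persistent ones.

    `runs` maps a pipeline name to its last status ("Succeeded"/"Failed");
    `rerun` is a callable that retries a pipeline and returns True on success.
    Both are stand-ins for real ADF monitoring/trigger calls.
    """
    escalations = {}
    for name, status in runs.items():
        if status == "Failed":
            log.warning("pipeline %s failed; applying workaround (rerun)", name)
            if not rerun(name):
                # The workaround did not help: escalate for root-cause analysis.
                escalations[name] = "rerun failed; escalate for root-cause analysis"
    return escalations

# Usage: one transient failure (fixed by rerun) and one persistent failure.
statuses = {"daily_sales_load": "Succeeded",
            "inventory_sync": "Failed",
            "ml_feature_build": "Failed"}
escalations = monitor_pipelines(statuses, rerun=lambda name: name == "inventory_sync")
```

The separation between "workaround" (automated rerun) and "escalation" (root-cause analysis) mirrors the two responsibilities listed for this role.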

Desired Qualification:

·         0-3 years of DE or batch support experience.

·         Must be willing to work entirely on the night shift.

·         Ability to work on a task independently or with minimal supervision.

·         Knowledge of Data Orchestration Tool, preferably ADF.

·         Knowledge of SQL or Hive, Python, PySpark, HDFS or ADLS.


Nos of Resources required: 1 to 2

Work location: Remote

Qualification: BE, BTech, MBA, MCA; Computer Science background preferred

Experience: 0-3 Years

Mobilization Period in weeks: 1 week

Duration: 6 to 12 months  

Fixed Price - Est. Time: 3 months,

Tableau Developers:

Tableau Developer Responsibilities:

  • Developing, maintaining, and managing advanced reporting, analytics, dashboards and other BI solutions.
  • Performing and documenting data analysis, data validation, and data mapping/design.
  • Reviewing and improving existing systems and collaborating with teams to integrate new systems.
  • Conducting unit tests and developing database queries to analyze the effects and troubleshoot any issues.
  • Creating tools to store data within the organization.

Tableau Developer Requirements:

  • Degree in Mathematics, Computer Science, Information Systems, or related field.
  • Relevant work experience.
  • A solid understanding of SQL, relational databases, and normalization.
  • Proficiency in use of query and reporting analysis tools.
  • Competency in Excel (macros, pivot tables, etc.)
  • Extensive experience in developing, maintaining and managing Tableau driven dashboards & analytics and working knowledge of Tableau administration/architecture.
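As a quick illustration of the SQL and normalization skills listed above, here is a minimal sketch using Python's built-in sqlite3 module (the table and column names are invented for the example): two normalized tables joined into the kind of aggregate a Tableau dashboard would consume.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized schema: customer attributes are stored once; orders reference them
# by key instead of repeating name/region on every row.
cur.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY,
                     customer_id INTEGER REFERENCES customers(id),
                     amount REAL);
INSERT INTO customers VALUES (1, 'Acme', 'South'), (2, 'Globex', 'North');
INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 250.0), (3, 2, 75.0);
""")

# A typical reporting query feeding a dashboard: revenue per region.
cur.execute("""
SELECT c.region, SUM(o.amount) AS revenue
FROM orders o JOIN customers c ON o.customer_id = c.id
GROUP BY c.region ORDER BY c.region
""")
rows = cur.fetchall()  # [('North', 75.0), ('South', 350.0)]
```

In practice a developer would point Tableau at such a query (or a view over it) rather than fetching rows by hand; the point is the join-and-aggregate pattern over a normalized schema.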

Experience:

  • 2 to 3 years

Job location:

  • Bangalore/Chennai/Hyderabad (WFH at the moment)

Duration:

  • 3 months. May get extended.

Joining:

  • Immediate
Fixed Price - Est. Time: 12 months,

SAS Data Integration Consultant:

  • Graduate or postgraduate in Engineering, or a postgraduate in a non-engineering discipline
  • Certification in Base SAS/Advanced SAS will be an added advantage
  • SAS Base
  • SAS DI
  • Good working knowledge of Base and Advanced SAS
  • No. of positions – 3
  • Role & Responsibilities – development and testing
  • Experience – 3-6 years
  • Location – Delhi NCR
Fixed Price - Est. Time: 6 months,

Position: PySpark Developer

Must Have Technical Skills: PySpark

Good to have Technical Skills: Spark, DWH

·         Hands-on experience in PySpark and NumPy, lists, Pandas, and graph plotting.

·         Excellent knowledge of Python frameworks and Airflow/Luigi/Dagster/Apache Beam.

·         Strong Python skills with a range of data and visualization libraries.

·         Good SQL knowledge.

·         Ability to perform multiple tasks in a continually changing environment.

·         Exposure to Hive/HBase/Redis.

·         Good knowledge of data warehousing.

·         Experience reading from Parquet and other formats.

·         Hands-on work with Hive and HBase.

 

Nos of Resources required: 3 to 4

Work location: Remote

Qualification: BTech

Experience: 4 yrs – 5 yrs

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months  

Fixed Price - Est. Time: 12 months,

OMP Delivery Manager

Must have Technical Skills : Functional experience in Demand Planning, Supply Planning, Scheduling, MEIO, S&OP solutions

Good to have Technical Skills: Experience implementing at least two OMP engagements

Preferred Industry Experience: Supply chain planning projects

Preferred Project Experience: Involved in two OMP implementations as a functional consultant

 

Role & Responsibilities:

Depth and breadth in OMP solutions, with prior hands-on experience implementing at least two OMP engagements.

Technical hands on experience with OMP Demand Planning, Supply Planning, and Production Planning solutions.

Experience as an architect, participating in design sessions and resolving technical issues during various project phases.

Proven experience conducting analysis sessions to understand the functional and technical needs of project requirements, and developing technical solutions to meet those needs.

Experience facilitating/leading functional / technical analysis and/or design workshops in support of desired to-be business requirements and then transferring the information gained into a written design document for group review and consensus.

Knowledge of integrating OMP solutions with SAP ERP.

Nos of Resources required: 1 to 2

Work location: Mumbai/Bangalore

Qualification: BE, BTech, MBA, MCA; Computer Science background preferred

Experience: 8 yrs - 12 yrs

Mobilization Period in weeks: 3 to 4

Duration: 3-4 quarters.

Fixed Price - Est. Time: 6 months,

Position: NiFi/Big Data Developer

Must Have Technical Skills: NiFi

Good to have Technical Skills: Python, ETL

Preferred Industry Experience: Telecom

·         Extensive experience using NiFi to set up data pipelines.

·         Hands-on experience using controllers and processors to set up an ETL framework in Apache NiFi

·         Extensive experience in Python

·         Good understanding of Spark, Spark Streaming, and PySpark.

·         Good understanding of Big Data components

Nos of Resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4 yrs – 5 yrs

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months  

Fixed Price - Est. Time: 12 months,

Position: Azure PySpark

PySpark is the core requirement; data warehousing and Azure experience will be an added advantage.

·         Must have low-level design and development skills; should be able to design a solution for given use cases.

·         Agile delivery: must be able to demonstrate design and code on a daily basis.

·         Must be an experienced PySpark developer with Scala coding skills; the primary skill is PySpark.

·         Must have experience designing job orchestration, sequencing, metadata design, audit trails, dynamic parameter passing, and error/exception handling.

·         Good experience with unit testing, integration testing, and UAT support

·         Able to design and code reusable components and functions

·         Should be able to review design and code, and provide review comments with justification

·         Zeal to learn and adopt new tools and technologies

·         Good to have experience with DevOps and CI/CD
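The orchestration concerns listed above — sequencing, dynamic parameter passing, an audit trail, and error/exception handling — can be sketched generically in plain Python. The task names and retry policy here are illustrative stand-ins, not part of PySpark or any specific orchestrator:

```python
def run_sequence(tasks, params, retries=1):
    """Run tasks in order, passing dynamic parameters and keeping an audit trail.

    `tasks` is a list of (name, callable) pairs; each callable mutates `params`.
    A failed task is retried up to `retries` times, after which the run aborts,
    as a real orchestrator would stop a dependent sequence.
    """
    audit = []  # audit trail: (task, attempt, outcome) tuples
    for name, task in tasks:
        for attempt in range(retries + 1):
            try:
                task(params)
                audit.append((name, attempt, "success"))
                break
            except Exception as exc:
                audit.append((name, attempt, f"error: {exc}"))
        else:
            # All attempts failed: record the abort and stop the sequence.
            audit.append((name, retries, "aborted"))
            return audit
    return audit

# Usage: the second task fails once (transient error), then succeeds on retry.
state = {"fail_once": True}
def extract(p): p["rows"] = 3
def transform(p):
    if p.pop("fail_once", False):
        raise RuntimeError("transient")
    p["rows"] *= 2
trail = run_sequence([("extract", extract), ("transform", transform)], state)
```

The shared `params` dict is the "dynamic parameter passing" piece, and `audit` is a minimal audit trail; in a real job these would typically live in a metadata store rather than in memory.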

Nos of Resources required: 1 to 2

Work location: Bangalore

Experience: 8 yrs – 9 yrs

Mobilization Period in weeks: 2 weeks

Fixed Price - Est. Time: 12 months,

SAS Data Quality Consultant:

  • The Data Quality developer will be responsible for analyzing and understanding data sources and end-user requirements using SAS DQ.
  • Must be aware of Data Quality dimensions and their implementation in the SAS DQ tool.
  • No. of positions – 2
  • Role & Responsibilities – development and testing
  • Experience – 3-6 years
  • Location – Delhi NCR
Fixed Price - Est. Time: 12 months,

Position: Azure Snowflake

  • Demonstrated ability to successfully complete multiple complex technical projects and to create the high-level design and architecture of a solution, including class, sequence, and deployment infrastructure diagrams.
  • Take ownership of technical solutions from design and architecture perspective for projects in presales phase as well as on-going projects.
  • Prior experience with application delivery using an Onshore/Offshore model
  • Experience with gathering end user requirements and writing technical documentation
  • Time management and multitasking skills to effectively meet deadlines under time-to-market pressure
  • Suggest innovative solutions based on new technologies and latest trends to sales team.
  • Review the architectural/ technological solutions for ongoing projects and ensure right choice of solution.
  • Work closely with the sales team and clients to understand their business, capture requirements, and identify pain areas, and accordingly propose an ideal solution to win business.

Azure: Hands-on experience in ADF (Azure Data Factory)

  • Hands-on experience in Big Data & Hadoop ecosystems
  • Exposure to Azure service categories such as PaaS components and IaaS subscriptions
  • Ability to design and develop ingestion and processing frameworks for ETL applications
  • Hands-on experience in PowerShell scripting and deployment on Azure
  • Experience in performance tuning and memory configuration
  • Should be adaptable to learning and working on new technologies
  • Should have good written and spoken communication skills

Nos of Resources required: 1 to 2

Work location: Remote for now (Bangalore)

Experience: 5 yrs – 6 yrs

Mobilization Period in weeks: 2 weeks

Duration: 6 to 12 months 

Fixed Price - Est. Time: 6 months,

Position: Hadoop Admin

Must Have Technical Skills: Hadoop Admin

Good to have Technical Skills: Linux Admin, ETL

·         Extensive experience with RedHat Linux and Cloudera is mandatory.

·         Experience in installing, configuring, upgrading and managing Hadoop environment.

·         Responsible for deployments, and for monitoring capacity, performance, and troubleshooting issues.

·         Work closely with data scientists and data engineers to ensure the smooth operation of the platform.

·         End-to-end performance tuning of the clusters.

·         Maintains and administers computing environments including computer hardware, systems software, applications software, and all configurations.

·         Defines procedures for monitoring; evaluates and diagnoses system issues and establishes work plans to resolve them.

·         Working knowledge of the entire Hadoop ecosystem: HDFS, Hive, YARN, Oozie, Kafka, Impala, Kudu, HBase, Spark, and Spark Streaming.

·         Knowledge of private and public cloud computing and virtualization platforms.

Nos of Resources required: 2 to 3

Work location: Remote

Qualification: BTech

Experience: 4 yrs – 5 yrs

Mobilization Period in weeks: 1 week

Duration: 3 to 6 months