Big Data Hadoop Jobs in Gurgaon District

1 to 10 of 49 vacancies

Vice President of Technology  

BluePi - Gurugram, Haryana, India

MicroStrategy, or Cognos. Machine learning: TensorFlow, PyTorch, or Scikit-learn. Artificial intelligence: AWS SageMaker, Azure ML, or GCP AI Platform. Big data: Hadoop, Spark, or Kafka. Programming languages: Python

from: linkedin.com - 5 days ago

Senior Data Solutions Architect  

Material - Gurugram, Haryana, India

Expertise in designing and building a new Cloud Data platform and optimizing it at the organization level. Hands-on experience in Big Data technologies - Hadoop, Sqoop, Hive and Spark - including DevOps.

from: linkedin.com - 7 days ago

Senior Data Engineer - Big Data/ETL  

Optivalue tek consulting - Gurgaon/Gurugram, India

Warehousing and Data Analytics skills. Proficiency in SQL and database management. Experience with big data technologies such as Hadoop, Spark, or Kafka. Strong problem-solving and analytical skills.

from: hirist.com - Yesterday

GCP Data Engineer  

Persistent Systems - Gurugram, Haryana, India

Scala, Hadoop, HDFS, YARN, Hive, GCP Cloud, Google Cloud Dataflow, Big Query, Dataproc, Elasticsearch, MySQL, Oracle, Cloudera, Azure HDInsight, AWS EMR, Python, Spring Boot, Spring Data, REST APIs

from: linkedin.com - 7 days ago

Data Engineer  

Impetus - Gurugram, Haryana, India

in Big Data technologies – Hadoop, Sqoop, Hive and Spark, including DevOps. Good hands-on expertise in either Python or Java programming. Good understanding of GCP core services like Google Cloud Storage

from: linkedin.com - 20 days ago

Senior Data Scientist - SQL/Python  

Bigfoot Retail Solutions Pvt. Ltd - Gurgaon/Gurugram, India

with machine learning frameworks (e.g., TensorFlow, PyTorch, scikit-learn). Experience with big data technologies (e.g., Hadoop, Spark) is a plus. Experience with AWS and cloud-based data solutions.

from: hirist.com - 3 days ago

Job Application-Qlik Lead-Immediate Joiner  

EXL - Gurugram, Haryana, India

having a good understanding of data models like Snowflake Schema and Star Schema, and a conceptual understanding of Data Warehouse, Big Data, Dimensional Modeling, and Hadoop. Exposure to working in an Agile framework

from: linkedin.com - 20 days ago

Data Engineer  

Impetus - Gurugram, Haryana, India

Cloud Functions, Big Query, GCS - at least 4 of these services. Good to have knowledge of Cloud Composer, Cloud SQL, Big Table, Cloud Function. Strong experience in Big Data technologies – Hadoop, Sqoop

from: linkedin.com - 25 days ago

Senior Data Engineer  

Capgemini - Gurgaon, HR, IN

with unstructured datasets. Proficiency in Spark Technology and Elastic pipeline engine. Experience with big data tools: Hadoop, Spark, Kafka, etc. Experience with relational SQL and NoSQL databases. Experience

from: Capgemini - 8 days ago

GCP data engineer  

Impetus - Gurugram, Haryana, India

background in Data Engineering. 3-6 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc. Exposure to enterprise application development is a must. Roles & Responsibilities: Able

from: linkedin.com - More than 30 days ago

