Big Data Hadoop Jobs in Gurgaon District
BluePi - Gurugram, Haryana, India
, MicroStrategy, or Cognos. Machine learning: TensorFlow, PyTorch, or Scikit-learn. Artificial intelligence: AWS SageMaker, Azure ML, or GCP AI Platform. Big data: Hadoop, Spark, or Kafka. Programming languages: Python
from: linkedin.com - 5 days ago
Material - Gurugram, Haryana, India
Expertise in designing and building a new Cloud Data platform and optimizing it at the organization level. Hands-on experience in Big Data technologies - Hadoop, Sqoop, Hive, and Spark, including DevOps.
from: linkedin.com - 7 days ago
Optivalue tek consulting - Gurgaon/Gurugram, India
Warehousing and Data Analytics skills. Proficiency in SQL and database management. Experience with big data technologies such as Hadoop, Spark, or Kafka. Strong problem-solving and analytical skills
from: hirist.com - Yesterday
Persistent Systems - Gurugram, Haryana, India
, Scala, Hadoop, HDFS, YARN, Hive, GCP Cloud, Google Cloud Dataflow, Big Query, Dataproc, Elasticsearch, MySQL, Oracle, Cloudera, Azure HDInsight, AWS EMR, Python, Spring Boot, Spring Data, REST APIs
from: linkedin.com - 7 days ago
Impetus - Gurugram, Haryana, India
in Big Data technologies – Hadoop, Sqoop, Hive, and Spark, including DevOps. Good hands-on expertise in either Python or Java programming. Good understanding of GCP core services like Google Cloud Storage
from: linkedin.com - 20 days ago
Bigfoot Retail Solutions Pvt. Ltd - Gurgaon/Gurugram, India
with machine learning frameworks (e.g., TensorFlow, PyTorch, scikit-learn). Experience with big data technologies (e.g., Hadoop, Spark) is a plus. Experience with AWS and cloud-based data solutions
from: hirist.com - 3 days ago
EXL - Gurugram, Haryana, India
having a good understanding of data models like Snowflake Schema and Star Schema, and a conceptual understanding of Data Warehouse, Big Data, Dimensional Modeling, and Hadoop. Exposure to working in an Agile framework
from: linkedin.com - 20 days ago
Impetus - Gurugram, Haryana, India
, Cloud Functions, Big Query, GCS - at least 4 of these services. Good to have knowledge of Cloud Composer, Cloud SQL, Big Table, Cloud Function. Strong experience in Big Data technologies – Hadoop, Sqoop
from: linkedin.com - 25 days ago
Capgemini - Gurgaon, HR, IN
with unstructured datasets. Proficiency in Spark Technology and Elastic pipeline engine. Experience with big data tools: Hadoop, Spark, Kafka, etc. Experience with relational SQL and NoSQL databases. Experience
from: Capgemini - 8 days ago
Impetus - Gurugram, Haryana, India
background in Data Engineering. 3-6 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc. Exposure to enterprise application development is a must. Roles & Responsibilities: Able
from: linkedin.com - More than 30 days ago