230 Cloud Hadoop Jobs in Bangalore - page 2
- GCP Technical Architect, Impetus - Bengaluru, Karnataka, India: … including knowledge of shell/Python scripting for automating common tasks. Roles & Responsibilities: able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, Cloud … (6 days ago)
- Senior Data Engineer - Hadoop/Scala/Spark, Zype - Bangalore, India: … technologies such as Hadoop, Spark, or Kafka. Strong SQL skills and experience with relational databases (e.g. MySQL, PostgreSQL). Experience with cloud platforms such as AWS, GCP, or Azure. Excellent … (6 days ago)
- Senior Cloud Infrastructure Manager, Securonix - Bengaluru, Karnataka, India: … and leverages the power of the Snowflake Data Cloud. Job Title: Senior Cloud Infrastructure Manager; Job Level: Manager; Total Experience: 14+ years; Relevant Experience: 8-10 years; Primary Skills: Manager, AWS … (9 days ago)
- Big Data Specialist, Talent21 Management and Shared Services Pvt Ltd - Bengaluru, Karnataka, India: Redshift is a must and Snowflake is a plus. Knowledge of Hadoop and/or Big Data cloud services such as AWS EMR or Cloudera. Ability to diagnose problems and resolve issues across various tiers (application … (12 days ago)
- Data Architect, Alstom - Bengaluru, Karnataka, India: … science, video analysis, sensor design, roaming data capture, etc., and building its digital assets. Alstom has developed the Mobility Data Platform, aimed at designing, building and operating a multi-tenant … (8 days ago)
- Member of Technical Staff - Distributed Systems, Andromeda Security - Bengaluru, Karnataka, India: … in the technologies below is a strong plus: TimescaleDB, Cassandra, RDS, Neptune, Neo4j, Redis; Gremlin or openCypher; GraphQL; big data processing systems (Kafka, Storm, Spark, Hadoop); cloud computing architectures … (12 days ago)
- Associate Technical Architect - Data, Quantiphi - Bengaluru, Karnataka, India: … ownership of the entire delivery of the project; designing and documentation of the solution; knowledge of RDBMS & NoSQL databases; any of Kafka, Kinesis, Cloud Pub/Sub; cloud platforms: AWS, GCP or Azure; Big … (11 days ago)
- Data Engineer - Synapse Analytics, Peepal Technology and Management Consultant - Bangalore, India: … GitHub, GitLab, Azure DevOps. Exposure to CD tools like Jenkins, Bamboo, Azure DevOps. Cloud exposure (Hadoop). Exposure to working on Power BI. Prior experience in building or working in a team … (10 days ago)
- Technical Architect, Impetus - Bengaluru, Karnataka, India: … Dataflow, Pub/Sub, Cloud Functions, Cloud Composer, BigQuery, Bigtable - at least 4 of these services. Strong experience in Big Data technologies (Hadoop, Sqoop, Hive and Spark), including DevOps. Good … (12 days ago)
- Data Engineer, Lineaje Inc - Bengaluru, Karnataka, India: … cloud vendors: AWS is a must. Experience working with distributed processing frameworks such as Spark, Hadoop, Hive, or other Apache Foundation frameworks. Deep working knowledge of Node.js … (19 days ago)