- Industry: Information Technology
- Category: Software Development
- Minimum Education: Bachelor's degree in Computer Science or related
- Skills: Experience with AWS/GCP services (EMR, Redshift, Google Data Studio, BigQuery)
- Job Location: Kathmandu
- Posted on: June 08, 2021
- Apply Before: June 15, 2021
- Experience in a programming language such as Java, Python, or Scala.
- Experience in building ETL pipelines, with familiarity in extraction, transformation, loading, filtering, cleaning, joining, scheduling, monitoring, and data streaming.
- Experience with data-processing tools (Spark, Hadoop).
- Experience with AWS/GCP services (EMR, Redshift, Google Data Studio, BigQuery).
- Familiarity with data-warehousing tools and processes (Snowflake, Redshift, S3, BigQuery).
- Experience in setting up ingestion pipelines (Apache Kafka, Amazon Kinesis).
- Familiarity with analytics and visualization tools is preferred.
- Candidates with certifications in big data tools are preferred.
- Experience with relational SQL and NoSQL databases.
- Familiarity with project management processes (Scrum, Kanban) and tools (Jira, Asana).
- Ability to work independently or in a collaborative environment, with a proactive attitude.