- Industry: Information Technology
- Category: Software Development
- Minimum Education: Bachelor's
- Skills: Time Management
- Job Location: Kathmandu
- Posted on: November 16, 2021
- Apply Before: November 30, 2021
At CloudFactory we believe that talent is equally distributed around the world, but opportunity is not. We also believe that the future of work is distributed and on-demand. We are leveling the playing field with technology that effectively shrinks the distance between companies and the world’s massive untapped talent pool. CloudFactory helps companies grow and scale by connecting them to a scalable WorkStream, staffed by a global workforce that is tightly integrated with their teams, processes, and systems. By harnessing the massive, untapped talent pool that exists in every corner of our planet, we will be able to maximize the potential of people, companies, and ideas, regardless of where they originate.
As a member of CloudFactory’s engineering team, you will be tasked with building and growing a world-class distributed workforce management platform that will connect one million people in developing countries to basic computer work.
Roles and Responsibilities
- Help to design and develop a data estate that is performant, accessible, secure, scalable, maintainable, and extensible.
- Help to implement true CI/CD using GitHub Actions, etc.
- Design and develop an EDW (enterprise data warehouse) using Snowflake and dbt.
- Design and develop an AWS Data Lake using S3, Athena, and Snowflake.
- Design and develop data ingestion pipelines using Snowpipe, Fivetran, etc.
- Model EDW entities and ensure all data is complete, accurate, timely, and documented within Data Dictionary and Ubiquitous Terms documents.
- Work towards the implementation of a true Self-Service BI platform.
- Implement good practices to ensure that Git is used in all circumstances. This includes reviewing others’ work and having your work reviewed and approved by colleagues.
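The warehouse and ingestion work described above can be illustrated with a small, hypothetical batch ELT sketch. Here sqlite3 stands in for a warehouse such as Snowflake, and all table and field names are invented for illustration:

```python
import json
import sqlite3

# Invented sample payloads; in a real pipeline these would arrive via
# an ingestion tool such as Snowpipe or Fivetran.
RAW_EVENTS = [
    '{"worker_id": 1, "task": "label_image", "seconds": 42}',
    '{"worker_id": 2, "task": "transcribe", "seconds": 90}',
]

def load_and_transform(conn, raw_rows):
    cur = conn.cursor()
    # "Extract/Load": land the raw payloads untouched in a staging table.
    cur.execute("CREATE TABLE IF NOT EXISTS stg_events (payload TEXT)")
    cur.executemany("INSERT INTO stg_events VALUES (?)", [(r,) for r in raw_rows])
    # "Transform": parse the staged JSON into a typed reporting table.
    cur.execute("""CREATE TABLE IF NOT EXISTS fct_task_time
                   (worker_id INTEGER, task TEXT, seconds INTEGER)""")
    for (payload,) in cur.execute("SELECT payload FROM stg_events").fetchall():
        row = json.loads(payload)
        conn.execute("INSERT INTO fct_task_time VALUES (?, ?, ?)",
                     (row["worker_id"], row["task"], row["seconds"]))
    conn.commit()

conn = sqlite3.connect(":memory:")
load_and_transform(conn, RAW_EVENTS)
total = conn.execute("SELECT SUM(seconds) FROM fct_task_time").fetchone()[0]
print(total)  # 132
```

Loading raw data first and transforming inside the warehouse (rather than before loading) is the ELT pattern that tools like dbt are built around.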
Governance & Data Protection
- Ensure that all work follows best security practices and fully adheres to GDPR, PECR, and other data regulations.
- Ensure that all work follows the correct approval and sign-off process before it is pushed into Production.
- Ensure that all work is documented and, where needed, has a runbook so the business can continue to support it.
- Where required, ensure that a PIA (Privacy Impact Assessment) is completed.
- Work with others in the team to keep the Data Dictionary and Ubiquitous Language complete and up to date.
- Follow existing processes and work to improve/identify gaps in these processes.
- Ensure the correct SDLC promotion processes are followed.
- Follow the correct sign-off processes to ensure that only approved releases are deployed into Production.
- Ensure that all AWS development follows CI/CD processes and is repeatable.
- Ensure that the AWS “Well-Architected Framework” is adhered to. See https://aws.amazon.com/architecture/well-architected/?wa-lens-whitepapers.sort-by=item.additionalFields.sortDate&wa-lens-whitepapers.sort-order=desc
- Evangelise the Data Team and its data estate across the business.
- Build relationships and work closely and collaboratively with all members of the Data Team and the wider Engineering Team.
- Work closely with and learn from tech and team leads and challenge proposed solutions with your own ideas.
- Ability to work across global teams and cultures in different time zones, with strong communication and collaboration skills.
- Tendency to go above and beyond to make things work; manage your own and others’ work to meet deadlines and assist other team members with their deliverables.
- Ability to break down complex problems into simple solutions and make the complex simple.
- Some experience and knowledge of a coding language such as Python.
- Good experience and knowledge of the SQL query language.
- Some understanding of star schemas and data warehouse concepts.
- Some knowledge of AWS tools and technologies (e.g., Lambda, S3, SQS, SNS, DynamoDB, RDS).
- Beneficial - ETL and ELT experience - both batch and microservices-led.
- Beneficial - Some Snowflake experience.
Tech Stack
- Snowflake (including Snowpipe, streams, security, integrations, etc.)
- AWS Data Lake (SNS, SQS, S3, Glue, Athena, DynamoDB, etc.)
- AWS Data Streams (Kinesis, Elasticsearch, Logstash, Lambda, API Gateway, etc.)
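For the streaming side of the stack above, a Lambda function consuming SQS messages is a common building block. This is a hedged sketch: the event shape follows the standard SQS-to-Lambda payload (a "Records" list whose entries carry a "body" string), while the processing itself is a made-up placeholder:

```python
import json

def handler(event, context=None):
    """Process a batch of SQS messages delivered to Lambda."""
    processed = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])  # each message body is JSON in this sketch
        processed.append(body["task_id"])
    # A real pipeline might write results to S3 or a Kinesis stream;
    # here we just report what was handled.
    return {"processed": processed, "count": len(processed)}

# Local invocation with a hand-built sample event:
sample_event = {"Records": [{"body": json.dumps({"task_id": "t-1"})},
                            {"body": json.dumps({"task_id": "t-2"})}]}
print(handler(sample_event))  # {'processed': ['t-1', 't-2'], 'count': 2}
```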
Benefits
- Lunch & Snacks Provided Monday-Friday
- Phone Allowance and/or Internet Allowance
- Travel Allowance
- Social Security Fund
- Festival Bonus
- Health Spending Account
- Medical Insurance
- Accidental Insurance
- Amazing Company Mission and Culture
- Growth Opportunities