What we expect from applicants:
- 3+ years of commercial Python development experience (OOP, multithreading); ideally, experience building systems that run under high load or with big data;
- 1+ year of hands-on Django experience;
- Knowledge of SQL, including query optimization and database configuration;
- Experience with (or a solid understanding of) NoSQL databases;
- Understanding of SOLID, DDD, knowledge of common design patterns;
- Ability to work confidently in Linux;
- Basic English: reading technical documentation, written communication with other developers;
- Experience with PostgreSQL, Redis, Amazon Web Services, Docker, Kubernetes;
- Experience with cloud services and development for them;
- Experience developing REST services and an understanding of how they work;
- Practical experience with asyncio, aiohttp, and SQLAlchemy;
- Ability to work in a Continuous Integration environment.
All of our tasks are built around the same workflow:
- We collect data from third-party APIs;
- We run QA on this data;
- We visualize and review this data;
- We give clients access to the data, either by uploading it to external destinations (BigQuery, Redshift, Amazon S3) or by providing direct access to Improvado storage.
All of this has to run reliably: hundreds of gigabytes are processed every day, and nothing should break.
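The steps above can be sketched roughly as follows. This is an illustrative outline only, assuming a simple collect → QA → export flow; all function names, field names, and the `campaign_id` validity check are hypothetical and not Improvado's actual code.

```python
# Hypothetical sketch of the collect -> QA -> export pipeline described above.

def collect(api_rows):
    """Collect raw rows from a third-party API (stubbed here as a list)."""
    return list(api_rows)

def qa(rows):
    """QA step: drop rows missing required fields (illustrative rule)."""
    return [r for r in rows if r.get("campaign_id") is not None]

def export(rows, destination):
    """Export validated rows to a destination (BigQuery, Redshift, S3, ...).
    In production this would batch-write via the destination's API."""
    return {"destination": destination, "count": len(rows)}

raw = [
    {"campaign_id": 1, "spend": 10.0},
    {"campaign_id": None, "spend": 5.0},  # fails QA
]
result = export(qa(collect(raw)), "bigquery")
```

In practice each stage runs as an independent, monitored service rather than a single in-process chain, but the data flow is the same.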
One example is our service for uploading large volumes of data from ClickHouse to external destinations; it accounts for changes in the client's data schema, keeps track of what new data has appeared, and so on.
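To make the two concerns that service handles concrete — schema-change detection and incremental uploads — here is a minimal sketch. All names (`detect_new_columns`, `rows_since`, the cursor on `id`) are assumptions for illustration, not the actual service's design.

```python
# Hypothetical sketch: schema-change detection and incremental export.

def detect_new_columns(known_columns, current_columns):
    """Return columns that appeared since the last export,
    so the destination table can be altered before uploading."""
    return [c for c in current_columns if c not in known_columns]

def rows_since(rows, last_exported_id):
    """Select only rows newer than the last exported cursor,
    so each run uploads just the new data."""
    return [r for r in rows if r["id"] > last_exported_id]

known = ["id", "date", "spend"]
current = ["id", "date", "spend", "impressions"]  # client added a column
new_columns = detect_new_columns(known, current)  # -> ["impressions"]

rows = [{"id": 1}, {"id": 2}, {"id": 3}]
to_export = rows_since(rows, last_exported_id=1)  # rows 2 and 3
```

The real service would persist the cursor and the known schema between runs and issue the corresponding DDL/COPY statements against the destination; the sketch only shows the bookkeeping logic.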