About Us
AlphaPoint's AI Labs team of engineers and AI scientists solves complex business problems by bridging the gap between transformative breakthroughs in AI technology and increasingly competitive markets. Our team develops and applies the latest generative AI, data, and knowledge modeling technologies to large-scale problems, right at the edge of what is possible.
AlphaPoint is a financial technology company powering digital asset exchanges and brokerages worldwide.

The Role
- Build scalable, highly performant infrastructure to process batch and real-time workloads
- Work with the AI engineering team and external engineering teams to monitor and extract data from a vast array of data sources
- Implement ETL data pipelines
- Architect backend data solutions to support various microservices
- Develop third-party integrations with large-scale legacy systems

You
- Bachelor's degree in computer science or a similar discipline
- 7-10 years of experience in software development
- Proficient in Python, Node.js, and/or Java
- Familiarity with the basic principles of distributed computing and data modeling
- Experience building ETL pipelines using Apache Airflow, Spark, Databricks, or other pipeline orchestration tools
- Experience with NoSQL databases such as MongoDB, Cassandra, DynamoDB, or CosmosDB
- Experience with real-time stream processing systems like Kafka, AWS Kinesis, or GCP Dataflow
- Experience with Redis, Elasticsearch, or Solr
- Experience with messaging systems like RabbitMQ, AWS SQS, or GCP Cloud Tasks
- Ability to find creative ways to harvest data in unstructured formats by scraping, modeling, and ingesting data into semantic databases and graphs
- Familiarity with Delta Lake and Parquet files
- Familiarity with one or more cloud providers:
Cloud Data Engineer • Austin, TX, US