Job Description
Location: Bay Area candidates are also welcome

About Eidon AI
We are a team of AI & Crypto researchers and engineers (coming from Google DeepMind, top-10 hedge funds, and pioneering AI/Crypto companies) with the long-term mission of co-creating the full decentralised AI stack in a trustless and permissionless way, with fair rewards for participation and contribution, so that it advances and benefits all of humanity.
Our first focus is on decentralising AI Data Availability and Data Restaking by connecting AI data publishers with data consumers in a decentralised network and marketplace.
Our team is small, operates in a flat structure, is highly motivated, and is focused on engineering excellence. Eidon is a place for those who like to take ownership, have strong curiosity and the drive to get things done, enjoy challenging themselves, and are ready to work at the intersection of multiple knowledge areas.
All engineers and researchers share the title "Member of the Technical Guild."
All employees are expected to be hands-on and to contribute directly to the mission of advancing decentralised AI. Leadership is given to those who show initiative and consistently deliver excellence.
Work ethic and strong prioritisation skills are important. We don't spend our time on PowerPoint decks or long meetings.
All engineers and researchers are expected to listen carefully, communicate with clarity and respect, and share knowledge openly, concisely, and accurately with teammates.
Eidon AI does not have recruiters, PMs, or other people/policy/patrolling staff. Every application is reviewed directly by a technical member of the team.
We are funded by top-tier Silicon Valley VCs.
Job Description
Eidon AI is at the forefront of revolutionising the AI data ecosystem through decentralisation. We're seeking Data Engineers who are passionate about building scalable data pipelines and storage solutions in a decentralised environment.
You will play a critical role in enabling seamless data flow between AI data publishers and consumers, ensuring integrity, availability, and fairness in data rewards.
Responsibilities
- Design, build, and maintain efficient, reliable, and scalable data pipelines and storage solutions within a decentralised framework.
- Work closely with AI Engineers and Protocol Engineers to ensure seamless data integration and accessibility.
- Collaborate with stakeholders to understand data needs, implement quick prototypes for AI data demand pilot projects, and create tools and solutions that can scale and grow into a decentralised AI Data protocol.
- Stay abreast of emerging blockchain and data technologies to continually enhance the protocol's data infrastructure.
Requirements
- Engineering background (e.g., CS, CE, or EE degree or equivalent experience).
- Proven experience in data engineering, with a solid understanding of data structures, algorithms, and software engineering principles.
- Strong expertise in programming languages such as Python, Go, or Rust, plus experience with Spark; experience with blockchain technologies is highly desirable.
- Familiarity with decentralised storage solutions (e.g., IPFS, Filecoin), data pipeline tools, and AI/ML data transformation and pipeline technologies.
- Experience building data processing libraries from scratch.
- Experience designing and implementing distributed systems.
- A habit of keeping up with state-of-the-art techniques for preparing AI data.