Yesterday's machine learning tools for computer vision were fragmented, incomplete, and inaccessible to nearly all developers. Standard Cyborg is founded on the belief that computer vision should be accessible to all, and that there is immense potential to be unlocked by bridging the gap between bits and atoms.
Hackers, students, startups, and large companies use our tools to build, test, and deploy computer vision machine learning applications without having to chain together a series of incompatible data types and products.
Simply put, we solve some of the hardest and most central problems in computer vision infrastructure so our customers can build amazing things faster.
Our impact: - We're empowering the next generation of spatially-aware products and applications. Hackers, students, startups, and large companies use our tools to design, test, run, and scale computer vision applications.
Our technology: - We're working on really hard technology, usually involving math. Our team members are experts in graphics, geometry, vision, and machine learning. - We work on tools as well as solutions. We eat our own dog food, which we think makes our tools even better.
Our team: - We're a team of ambitious and humble engineers who, first and foremost, care about solving real problems and shipping great products that we're proud to put our names on. - We have deep expertise in graphics, geometry, computer vision, and machine learning. We love solving hard problems and want to push the boundaries of what's technically possible.
We run a weekly development cycle, but many of our projects are open-ended, hard problems, so we're not strictly agile.
We start each week on Monday by reviewing what we got done last week and planning what we want to get done this week, making sure we're on track to hit our team-committed timelines and goals.
We use GitHub and GitHub PRs to review code. Designs/specs are written where useful, but we don't yet strictly enforce them. We optimize for speed and data-model correctness. All our engineers have a large degree of autonomy in accomplishing their goals. A lot of collaboration and sanity-checking conversation happens in Slack, which helps keep everyone on the same page.
- Real-time RGB-D SLAM for the TrueDepth sensor
- C++ SDK for handling 2D and 3D data structures, I/O, and relevant algorithms
- Emscripten WebGL viewer for viewing and editing scenes
- Novel CoreML architectures for 2D landmarking on mobile
- Novel 3D point cloud semantic segmentation networks
Our customers need to run ML training, ML inference, and conventional CV programs (backed by our SDK) as a service in the cloud. These jobs will be triggered manually via the web dashboard, through our API, or through events and hooks in our web platform (e.g., a webhook could trigger a validation job, or onCreate could trigger an ML classification). You'd build this service from the ground up.
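To make the event-to-job flow concrete, here is a minimal sketch of the dispatch pattern described above. All names (`Job`, `JobQueue`, `EventRouter`, the event strings) are illustrative assumptions, not the actual Standard Cyborg API; a real implementation would back the queue with a cloud job runner rather than an in-memory list.

```python
# Hypothetical sketch: routing platform events (webhooks, onCreate,
# API calls) to cloud job submissions. Names are illustrative only.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Job:
    kind: str       # e.g. "validation" or "ml_classification"
    payload: dict   # event data forwarded to the worker


@dataclass
class JobQueue:
    """In-memory stand-in for a real cloud job runner."""
    jobs: List[Job] = field(default_factory=list)

    def submit(self, job: Job) -> None:
        self.jobs.append(job)


class EventRouter:
    """Maps platform events to one or more job submissions."""

    def __init__(self, queue: JobQueue):
        self.queue = queue
        self.handlers: Dict[str, List[Callable[[dict], Job]]] = {}

    def on(self, event: str, handler: Callable[[dict], Job]) -> None:
        self.handlers.setdefault(event, []).append(handler)

    def dispatch(self, event: str, payload: dict) -> None:
        # Every handler registered for this event submits a job.
        for handler in self.handlers.get(event, []):
            self.queue.submit(handler(payload))


# Wire up the two examples from the text: a webhook triggering a
# validation job, and onCreate triggering an ML classification.
queue = JobQueue()
router = EventRouter(queue)
router.on("webhook", lambda p: Job("validation", p))
router.on("onCreate", lambda p: Job("ml_classification", p))

router.dispatch("onCreate", {"scene_id": "abc123"})
```

The same router could serve the manual dashboard and API paths by dispatching a synthetic event, so all three trigger mechanisms funnel into one queue.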
Standard Cyborg is a close-knit team motivated by curiosity and a passion for meeting meaningful, real-world needs. Our team is original, humble, enthusiastic, and ready to spend the next several years advancing the state of the art in computer vision ML. We're looking for others who deeply value these traits and see themselves actively contributing to building this culture.
We value originality in thought and humility in our convictions.
We’re endlessly curious and tenacious in our effort to make the complex field of computer vision approachable, humane, and easy to deploy for our customers.
We believe that great product design, paired with high-quality tooling, documentation, and examples, is the best way to achieve that.
We are pragmatic, not dogmatic. No ideas are taken personally, and all ideas are open to discussion.
We offer unlimited vacation. And we want you to use it!
Monthly Education/Learning stipend to use for personal growth!
Lunch and learns, alternating team happy hours, quarterly offsites, and weekly demo show-offs and research paper overviews.