Our mission is to build robots that save people time and energy. We believe that we can use computer vision to create truly autonomous machines that work for us.
Technology is humanity’s greatest enabler, and its purpose is to make our lives easier — to give us back our time and energy. We are a small team laser-focused on this mission, and these are the principles that guide us.
* Solving real problems. Central to our mission is a desire to solve real, existing problems by shipping a product that people will find useful. While others focus on problems that may seem cool at the moment, we want to build products that people cannot live without.
* Perception through computer vision and machine learning. We are on a mission to teach robots to see the world the way we do. We want to develop algorithms and scale them into a product people can use every day.
* Integrating hardware and software. To achieve a great user experience, we need to design hardware and software together so that they are optimized for each other. We wholeheartedly agree with Alan Kay when he said, “People who are really serious about software should make their own hardware.”
If you’re ambitious and would love to challenge the status quo, come join us. This is definitely not for the faint-hearted.
Why join us?
Navneet and Mehul are repeat founders with two exits and multiple successful products under their belts. Navneet also invented HOG, the foundational technique of classical computer vision; his PhD thesis now has 27K+ citations.
We are a mission-driven and tight-knit group, working in a fast-paced environment with high autonomy and ownership. We highly value learning and curiosity, and we have a high-risk culture where we seek 10x solutions with the understanding that we may not always succeed.
Everyone is directly responsible for a key component of the product and has end-to-end ownership. Still, as a small team, everyone takes on multiple roles, from research to prototyping to recruiting.
You’ll develop, build and optimize machine learning, computer vision, natural language, and perception algorithms for autonomous robots. You’ll also play an integral role in the product development process and other aspects of building a company from the ground up.
Engineering at Matician
All engineers are empowered to find projects at the intersection of their interests and product needs, and to work on the entire stack, from hardware to software to embedded systems and firmware.
Current robots seriously lack a great perception stack (i.e. detailed maps of an environment, the ability to precisely locate their current position, detection of various objects, and natural interactions with users). We are building a camera, microphone, and speaker solution for such natural interactions. The domains include everything from high-fidelity visual SLAM, structure-from-motion, deep learning, reinforcement learning, object detection, 3D computer vision, and face detection/recognition to natural language interactions. Alongside this, we tightly integrate all of the above with the electrical and mechanical capabilities of our robots.
For example, you would get to work on:
* Creating high-fidelity maps through visual SLAM, 3D reconstruction, mapping, and navigation
* Building state-machine algorithms
* Optimizing and accelerating on-device algorithms
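To give a flavor of the state-machine work above, here is a minimal sketch of a table-driven finite state machine for a robot's high-level behavior. Every state and event name here is hypothetical, chosen purely for illustration, and not Matician's actual design:

```python
from enum import Enum, auto

class State(Enum):
    """Hypothetical high-level robot modes."""
    IDLE = auto()
    MAPPING = auto()
    CLEANING = auto()
    DOCKING = auto()

# Transition table: (current state, event) -> next state.
# Events not listed for a state are simply ignored.
TRANSITIONS = {
    (State.IDLE, "start"): State.MAPPING,
    (State.MAPPING, "map_complete"): State.CLEANING,
    (State.CLEANING, "battery_low"): State.DOCKING,
    (State.CLEANING, "done"): State.DOCKING,
    (State.DOCKING, "docked"): State.IDLE,
}

class RobotStateMachine:
    def __init__(self):
        self.state = State.IDLE

    def handle(self, event: str) -> State:
        # Look up the transition; stay in the current state if none applies.
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state
```

Keeping the transitions in a plain table, rather than scattering `if`/`elif` chains across the codebase, makes the behavior easy to audit and to extend with new states.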
Working at Matician
We are a team with an intense desire to learn; we want to be challenged to do things we did not think were possible; we take extreme pride in the work we do; and we are genuinely unafraid to push boundaries and think from first principles. We hate the status quo!
We have a transparent culture and a team of owners who want to build (to borrow Steve Jobs' line) insanely great products that users love.
We work hard. Play hard.
We cover 99% of health care premiums. Health is paramount.
We don't have fixed work hours: some folks arrive at 9am and leave at 6pm, while others arrive at 12pm and leave at 9pm. We have a you-be-you culture.
Monthly poker nights, frequent team events, and offsites as needed.
Interested in this company?
Skip straight to final-round interviews by applying through Triplebyte.