Databricks Unified Analytics Platform, from the original creators of Apache Spark™, unifies data science and engineering across the machine learning lifecycle, from data preparation to experimentation and deployment of ML applications.
Senior Software Engineer - Platform San Francisco, CA, United States or Silicon Valley, CA, United States
Sr. Software Engineer - Security Orchestration East Bay, CA, United States, San Francisco, CA, United States, or Silicon Valley, CA, United States
Why join us?
Databricks has gone from almost no revenue to over $100 million in annual recurring revenue in just three years, putting us among the fastest-growing enterprise software companies.
Experienced approximately 3x year-over-year growth in subscription revenue during the last quarter of 2018.
Over 2,000 organizations globally, such as Nielsen, Hotels.com, Overstock, Bechtel, Shell and HP, are leveraging Databricks to unify data science and data engineering teams across the end-to-end data and machine learning lifecycle.
Accelerating the adoption of Databricks’ Unified Analytics Platform in 2018 was the availability of Azure Databricks, a first-party integrated Microsoft Azure service. Azure Databricks was built in collaboration with Microsoft to simplify the process of building big data and AI solutions by combining the best of Databricks and Azure. Azure customers are able to get Azure Databricks from Microsoft and begin using it with the touch of a button, as they would any Azure service.
Engineering at Databricks
A brand-new team is being built inside the security org.
Example goal: By the end of 2019 we will have built a system for digital asset identification that becomes the source of truth for all assets at Databricks, and every new asset is assessed for security risk.
Phase 1 - Planning & Prototyping: External vendors and the market will be assessed and compared, with purchases made where needed. A product roadmap/development plan will be laid out. An initial working prototype of the system flow will be in place, in which an asset is detected or removed, scanned, and notifications are sent.
Phase 2 - Foundations: Infrastructure, base foundation building, and tests.
Phase 3 - External assets: External assets (IPs, services, web, datastores) are synced; the system is API-queryable and supports webhooks.
Phase 4 - Internal assets: Internal assets (endpoints, cloud, services) are synced; dashboard and reporting are added.
Metrics goal for end of Phase 4: external coverage 100%; internal coverage 80%; mean time to identify an asset: < 60 minutes; mean time to identify a vulnerability: < 60 minutes.
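The asset-inventory flow described in the phases above could be sketched roughly as follows. This is a minimal illustration only; the names (`Asset`, `AssetInventory`) and the notification/coverage logic are hypothetical, not Databricks internals:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List

@dataclass
class Asset:
    asset_id: str
    kind: str          # e.g. "ip", "service", "web", "datastore", "endpoint"
    scope: str         # "external" or "internal"
    first_seen: datetime

class AssetInventory:
    """Toy source-of-truth inventory: sync assets, notify on new ones."""

    def __init__(self) -> None:
        self.assets: Dict[str, Asset] = {}
        self.notifications: List[str] = []

    def sync(self, asset: Asset) -> None:
        # A newly discovered asset triggers a security-risk assessment.
        if asset.asset_id not in self.assets:
            self.notifications.append(f"assess: {asset.asset_id}")
        self.assets[asset.asset_id] = asset

    def coverage(self, scope: str, known_total: int) -> float:
        # Fraction of known assets in a scope that the inventory has synced.
        synced = sum(1 for a in self.assets.values() if a.scope == scope)
        return synced / known_total if known_total else 0.0

inv = AssetInventory()
inv.sync(Asset("ip-1", "ip", "external", datetime(2019, 1, 1)))
inv.sync(Asset("svc-1", "service", "internal", datetime(2019, 1, 2)))
print(inv.coverage("external", known_total=1))  # 1.0
print(inv.notifications)                        # ['assess: ip-1', 'assess: svc-1']
```

The coverage metric here mirrors the Phase 4 goals: with an external `known_total` from discovery scans, a team could track progress toward 100% external and 80% internal coverage.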
Working at Databricks
Lunch every day and happy hours on Fridays.
Work from Home