Bugcrowd, the 12-year-old crowdsourced cybersecurity startup, has secured $102 million in Series E funding. The investment, led by General Catalyst, is intended to fuel Bugcrowd’s international expansion and future growth. The company taps into a network of more than 500,000 hackers and helps organizations such as OpenAI and the U.S. government set up and run bug bounty programs.
Bugcrowd’s distinctive approach centers on a two-sided security marketplace. On one side, it recruits skilled hackers who must demonstrate their abilities to join the platform; on the other, it matches them with clients’ bounty programs, spanning technology companies and enterprises that depend on tech infrastructure.
The startup’s platform extends beyond bug bounty programs to services such as penetration testing, attack surface management, and hacker training. While AI-powered security tools have become prevalent, human hackers remain crucial for identifying gaps and vulnerabilities that automated tools might overlook.
Bugcrowd’s CEO, Dave Gerry, emphasized the company’s growth trajectory. Although the valuation remains undisclosed, Gerry confirmed it has increased significantly since the last funding round in 2020. The funding will fuel Bugcrowd’s expansion efforts, including potential mergers and acquisitions.
As the cybersecurity landscape evolves, Bugcrowd’s reliance on both automated tools and human expertise highlights the importance of a balanced approach to securing digital ecosystems. The company’s mission is to create a safer online world by harnessing the collective power of ethical hackers and skilled professionals.
In conclusion, Bugcrowd’s $102 million Series E round underscores the growing importance of bug bounty programs in the cybersecurity landscape. By leveraging a network of more than 500,000 skilled hackers, Bugcrowd aims to create a safer online world. Its model, combining human expertise with automated tools, reinforces the case for a balanced approach to securing digital ecosystems. As AI and security technologies continue to advance, it’s crucial to recognize the value of both human and automated efforts in safeguarding our connected world.