#DataScientist, #DataEngineer, Blogger, Vlogger, and Podcaster at http://DataDriven.tv.
Back @Microsoft to help customers leverage #AI. #武當派 fan.
I blog to help you become a better data scientist/ML engineer. Opinions are mine.
Ashita Rastogi, Senior PM on the IoT Hub team, joins the IoT Show to dive into the details of this new feature's advantages and to demo it for viewers.
After the show, you'll understand how to build a solution that triggers a serverless application to deliver a notification to Microsoft Teams whenever a new telemetry message arrives at your IoT Hub.
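As a rough sketch of the final step in that flow (not taken from the show), a serverless function could shape the incoming telemetry into a Teams incoming-webhook "MessageCard" payload before POSTing it. Everything below — the helper name, device id, and telemetry fields — is an illustrative assumption, not the demoed code:

```python
import json

def teams_card_from_telemetry(device_id, telemetry):
    """Build a Teams incoming-webhook MessageCard from one telemetry message.

    Hypothetical helper: the fields shown follow the legacy MessageCard
    schema that Teams incoming webhooks accept. A serverless function
    (e.g., an Azure Function triggered by IoT Hub's Event Hub-compatible
    endpoint) could POST this JSON to the webhook URL.
    """
    facts = [{"name": k, "value": str(v)} for k, v in sorted(telemetry.items())]
    return {
        "@type": "MessageCard",
        "@context": "https://schema.org/extensions",
        "summary": "Telemetry from {}".format(device_id),
        "title": "New telemetry from {}".format(device_id),
        "sections": [{"facts": facts}],
    }

# Illustrative message; a real function would read this from the trigger binding.
card = teams_card_from_telemetry("thermostat-01", {"temperature": 21.5, "humidity": 40})
print(json.dumps(card, indent=2))
```

The actual HTTP POST to the webhook URL is omitted here, since the webhook address is secret and deployment-specific.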
Siraj Raval interviews Vinod Khosla in the latest edition of his podcast.
Vinod Khosla is an entrepreneur, venture capitalist, and philanthropist. It was an honor to have a conversation with a Silicon Valley legend whom I've admired for many years. Vinod co-founded Sun Microsystems over 30 years ago, a company that grew to over 36,000 employees, created foundational software technologies such as the Java programming language and NFS, and pretty much mainstreamed the idea of open source. After a successful exit, he's been using his billionaire status to invest in ambitious technologists trying to improve human life. He's got the coolest investment portfolio I've seen yet, and in this hour-long interview we discuss everything from AI to education to startup culture. I know my microphone volume should be higher in this one; I'll fix that in the next podcast. Enjoy!
Time markers for our discussion topics are below:
2:55 The Future of Education
4:36 Vinod’s Dream of an AI Tutor
5:50 Vinod Offers Siraj a Job
6:35 Choose your Teacher with DeepFakes
8:04 Mathematical Models
9:10 Books Vinod Loves
11:00 What is Learning?
14:00 The Flaws of Liberal Arts Degrees
16:10 Indian Culture
21:11 A Day in the Life of Vinod Khosla
23:50 Valuing Brutal Honesty
24:30 Distributed File Storage
30:30 Where are we Headed?
33:32 Vinod on Nick Bostrom
38:00 Vinod’s Rockstar Recruiting Ability
43:00 The Next Industries to Disrupt
49:00 Vinod Offers Siraj Funding for an AI Tutor
51:48 Virtual Reality
52:00 Contrarian Beliefs
54:00 Vinod’s Love of Learning
55:30 USA vs China
Abstract: We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers. Unlike recent language representation models, BERT is designed to pre-train deep bidirectional representations by jointly conditioning on both left and right context in all layers. As a result, the pre-trained BERT representations can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks, such as question answering and language inference, without substantial task-specific architecture modifications. BERT is conceptually simple and empirically powerful. It obtains new state-of-the-art results on eleven natural language processing tasks, including pushing the GLUE benchmark to 80.4% (7.6% absolute improvement), MultiNLI accuracy to 86.7 (5.6% absolute improvement) and the SQuAD v1.1 question answering Test F1 to 93.2 (1.5% absolute improvement), outperforming human performance by 2.0%.
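The "just one additional output layer" idea from the abstract can be sketched in a few lines: a single linear-plus-softmax classification head on top of the pooled representation a pre-trained encoder produces. The sketch below uses random NumPy values as a stand-in for a real BERT encoder's output; the hidden size of 768 matches BERT-base, and the three labels are an illustrative NLI-style example:

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN = 768      # BERT-base hidden size
NUM_LABELS = 3    # e.g., entailment / neutral / contradiction for NLI

# Stand-in for the pooled [CLS] vector a pre-trained BERT encoder would
# emit for one input sequence (illustrative random values, not real BERT).
pooled_cls = rng.standard_normal(HIDDEN)

# The single task-specific output layer added for fine-tuning:
# a dense projection from the hidden size down to the label count.
W = rng.standard_normal((NUM_LABELS, HIDDEN)) * 0.02
b = np.zeros(NUM_LABELS)

def softmax(z):
    # Numerically stable softmax over the logits.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

logits = W @ pooled_cls + b
probs = softmax(logits)
print(probs.shape)  # one probability per label
```

During fine-tuning, both this head and the encoder's weights are updated end-to-end on the downstream task; only the head is newly initialized.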