College Humor claims to have an AI-written script for the Game of Thrones finale. It’s funny to watch and reads like it was written by a bot. However, without a link to any documentation or source code, we’ll have to take it on faith that it’s the product of AI rather than random typing (like the actual GoT finale).
Here are some interesting thoughts from a frequent (and I do mean frequent) contributor to Towards Data Science. An article on Data Science once a week? I need to up my game from monthly to daily!
In any case, here are the five takeaways from a year of weekly data science writing:
- You can learn everything you need to know to be successful in data science without formal instruction
- Data science is driven by curiosity
- Consistency is the most critical factor for improvement in any pursuit
- Data science is empirical: instead of relying on proven best methods, you have to experiment to figure out what works
- Writing about data science — or anything else — is mutually beneficial: it helps both you and the entire community
Not long ago, I thought my days of writing research papers were behind me. Since getting into data science, that has proven to be a faulty assumption. While the field has definitely left the academic sphere and taken off in the public and commercial space, it still carries a heavy academic accent. There are even data science conferences that require a paper to be published before you can submit a talk.
Fortunately for me, Siraj Raval shares his tips on how to write a research paper.
As many of you know, I have had a regular column in MSDN Magazine on UWP development for the past 18 months. However, today I am excited to announce that the column will now shift focus to Data Science, AI, and Machine Learning. This is a natural progression, given the direction of FranksWorld.com and the Data Driven podcast.
I’m even more excited to announce that the new column, called “Artificially Intelligent,” is available as of the October issue, which is online now.
Here’s a sample:
Over the last 10 years, the focus of many developer and IT organizations was the capture and storage of Big Data. During that time, the notion of what a “large” database size was grew in orders of magnitude from terabytes to petabytes. Now, in 2017, the rush is on to find insights, trends and predictions of the future based on the information buried in these large data stores. Combined with recent advancements in AI research, cloud-based analytics tools and ML algorithms, these large data stores can not only be mined, but monetized.