It is without a doubt that Artificial Intelligence (AI) has taken the world by storm. Although the field used to be a rather arcane domain, the rise of data science has refurbished its image and won it the respect and admiration of people across the tech world. In particular, since Prof. Hinton's novel approach to neural networks (his deep learning networks), A.I. has seen a huge boost in popularity and has been embraced wholeheartedly by the data science community. Living up to the hype of easy and robust predictive analytics models that require little, if any, domain knowledge, deep learning networks have delivered where other models have failed. This makes one wonder what the next step in this evolution will be.
Contrary to what many people think, deep learning and the similar A.I. technologies that constitute the cutting edge of A.I. today are not all about GPUs, although this kind of cheap computing power certainly plays an important role in the field. Beyond the raw computational capabilities of the hardware, A.I. is also about writing programs that apply its principles in a robust and efficient manner. Even though there are packages for this technology in practically every language out there, people tend to flock to either the more efficient or the more easy-to-use programming platforms. This polarization is to be expected, considering that the programming languages themselves are polarized: there are lower-level, compiled languages like C, C++, and Java that are very fast but a major pain to write code in, and there are high-level languages like Python and R that are easy to develop scripts in but painfully slow when it comes to executing those scripts.

Of course, the latter problem is solved to some extent by linking these languages to a distributed computing (big data) platform, such as Hadoop or Spark. That's great, but it usually means you spend the majority of your time doing ETL between the programming language you use and the big data platform, plus a lot of data engineering to ensure that everything works well. With Spark things aren't that bad, but it is often the case that, in order to get better performance, you end up translating your high-level scripts into Scala, the language that this particular big data platform works best with.

To resolve this false dichotomy, some people at MIT developed Julia. I've talked about this language in a previous post, so this time I'd like to focus on its usefulness in A.I. Julia doesn't need the overhead that other languages require in order to process big data. Also, as of late, it has become easy to use in a GPU setting, so deploying a deep learning network in Julia doesn't require advanced expertise. Most importantly, because it is designed to be very fast, it is ideal for computationally expensive processes, such as those involved in a modern A.I. system. A language like that is more or less ideal for this kind of sophisticated model, while also lending itself to experimenting with new A.I. systems. But don't take my word for it; give it a try and see for yourself!
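As a starting point, here is a minimal sketch of what a small deep learning model can look like in Julia. It assumes the Flux.jl package (and, optionally, CUDA.jl for GPU support), which the post above doesn't name explicitly; the layer sizes and data are purely illustrative.

```julia
using Flux  # assumes the Flux.jl deep learning package is installed

# A small feed-forward network: 4 input features, one hidden layer, 3 output classes
model = Chain(
    Dense(4 => 16, relu),
    Dense(16 => 3),
    softmax
)

# Moving the model to the GPU is a one-liner (requires CUDA.jl and a compatible card):
# model = gpu(model)

x = rand(Float32, 4, 10)   # a toy batch of 10 samples, just for illustration
ŷ = model(x)               # forward pass: a 3×10 matrix of class probabilities
```

A few lines like these are roughly all the ceremony involved; training on a GPU is mostly a matter of moving the model and data there with `gpu`, rather than rewriting anything.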