Tips for AI Newbies with Non-Technical Backgrounds: Insights from Leandro von Werra (Hugging Face) and Joshua Starmer at Uphill Conf 2024
The Uphill Conference is held at the top of the Gurten, Bern's local mountain in Switzerland. Until 2019 it was a front-end development conference; after a five-year break it returned with a new focus on artificial intelligence. The AI edition took place on May 16–17, 2024, on the Gurten, with speakers from the U.S., the U.K., Spain, and Germany giving talks and workshops on AI, LLMs, machine learning, and more.
Quite a few people recognized me from the Powercoders t-shirt I was wearing, because several big Swiss software companies have interns from Powercoders. Iterativ, where I'm currently interning, has also co-organized the conference; this year it is a silver sponsor.
Why You Should Come to Uphill Conf:
1. Speaker Lineup
The organizers of Uphill Conf meticulously reviewed many videos to ensure the speakers are excellent presenters. They also leveraged connections, including a Hugging Face developer who lives in Bern, to secure top-notch speakers for the conference, and industry leaders recommended additional ones.
2. Balanced Mix of Theory and Practicality
The first day is packed with workshops, while the second day offers the opportunity to listen to main talks or join discussions in the lounge. It’s an excellent place to learn about AI.
3. Networking and Job Opportunities
Many companies are hiring, making this conference a great place to find a job.
++ Nice view of Bern from the Gurten, delicious lunch and apéro :)
I prepared 15 questions before attending the conference and put them to the speakers after their talks. Here are their answers.
1. Chaewon: How do you see AI evolving over the next five years, particularly in terms of new opportunities and newcomers to the field?
Josh: Personally, I would prefer people to start drawing inspiration from nature and from how biological systems can reason; I think that would be fantastic. In terms of jobs, though, I think it's a tool you've got to learn how to use, right? I mean, it's sort of like learning how to use a steam shovel versus just a normal shovel.
2. Chaewon: Can you recommend any specific books, online courses, or resources that provide a good introduction to AI concepts and techniques?
Josh: For entry level, I've put some practical exercises on YouTube. You can also start by giving user feedback to it. You can start here:
- Support Vector Machines: https://www.youtube.com/watch?v=efR1C6CvhmE
- XGBoost in Python from Start to Finish: https://www.youtube.com/watch?v=GrJP9FLV3FE
3. Chaewon: Can you explain the difference between AI, machine learning, and deep learning in simple terms?
Josh: Artificial Intelligence (AI) has encompassed many different concepts over time. Nowadays, when people think of AI, they often think of chatbots like ChatGPT and large language models (LLMs).
Machine learning (ML) is about making predictions, both classifications and quantifications. For example, a classification might be distinguishing between someone wearing pants or a dress, while a quantification might be predicting how fast someone types. Essentially, ML is about learning how to make predictions from data.
Deep learning is a subset of ML that uses neural network architectures. It involves multiple layers of neurons to improve prediction accuracy. Although deep learning used to be more prominent, the advent of transformer models has somewhat overshadowed traditional deep learning methods. While deep learning might see a resurgence, the focus is currently shifting towards transformer models, with data playing a crucial role in their success. (It’s a term out of style, according to Josh)
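(A small aside from me, not from Josh's talk: below is a minimal Python sketch of what "learning to make predictions from data" looks like in practice. The toy dataset and model choices are purely illustrative, with a linear model standing in for classic ML and a small multi-layer network standing in for deep learning.)

```python
# Toy illustration: the same prediction task solved with a classic ML model
# and with a small neural network (stacked layers of neurons).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic data standing in for any classification problem.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Classic machine learning: a single linear model learns from the data.
linear = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Deep learning (in miniature): the same task, but with multiple hidden layers.
mlp = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=2000,
                    random_state=0).fit(X_train, y_train)

print("linear model accuracy:    ", linear.score(X_test, y_test))
print("small neural net accuracy:", mlp.score(X_test, y_test))
```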
4. Chaewon: What are the most common misconceptions about AI that you encounter, and how do you address them?
Kyra Goud (Red Hat): The biggest misconception is that AI will replace us. AI still has real drawbacks. One is hallucination: for people, a hallucination is when you hear, see, smell, taste, or feel something that seems real but exists only in your mind; an AI model hallucinates in a similar way when it confidently produces output that isn't grounded in reality. Another is the training-data cutoff: reality changes every day, and AI cannot keep up with all the new information.
Bloomberg trained a GPT-3.5-class AI on its own financial data last year, which cost them around 10 million USD. This could imply that much of future investing will be done by investing bots.
5. Chaewon: How can professionals from non-technical backgrounds transition into AI roles, and what advice would you give them to make this shift successful?
Leandro von Werra (Hugging Face): Yeah, that’s a good question. I think it depends a little bit on where exactly you’re coming from and what your background is. If you’re a software developer, you’re in a very good position to make the transition into machine learning. Whether you want to become a researcher training models or a practitioner using those models within companies will influence your approach.
If you aim to become a power user, the tools to train, fine-tune, and evaluate models have become increasingly accessible. In just the last 12 months, the availability of these models and the tools built around them have made significant strides. Fine-tuning and evaluating models, as well as improving efficiency, have all become much more manageable.
Previously, you needed a substantial GPU or multiple chips to run these models. Now, you can run them on your laptop due to new techniques and tools. For example, I can now run a model on my MacBook that I couldn’t have imagined running before. It’s quite exciting.
If you have a bit of a technical background, it's a great time to get into machine learning. Many of us currently working in this field haven't been in it for long, so nobody has much of a head start. Saying it's too late now would be like saying you shouldn't have gotten into software development in the '90s because there was already so much going on. There's still a lot to come and many opportunities ahead.
There are numerous online resources available, such as courses from Stanford and MIT. Personally, I like to learn a new topic by picking a project and then learning what I need to know to complete it. I’m not a great classroom learner who goes through topics systematically; I prefer to build something and learn about all the necessary aspects by researching as I go. This is my personal approach.
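(Another aside from me, not part of Leandro's answer: a minimal sketch of what "running a model on your laptop" can look like today, using the Hugging Face transformers pipeline API. The model name below is just an example of a small open model from the Hub; any similarly sized model would do.)

```python
# Minimal sketch: generate text with a small open model running locally.
# Assumes `pip install transformers torch`; the model name is only an example.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Qwen/Qwen2.5-0.5B-Instruct",  # example small model; swap in your own
)

prompt = "Explain in one sentence why small open language models are useful."
outputs = generator(prompt, max_new_tokens=60)
print(outputs[0]["generated_text"])
```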
Here are links to transcriptions of the talks, made with the Korean service Clovanote:
Neural Networks: Where they are now, where they came from, and where they might go
Joshua Starmer
https://clovanote.naver.com/s/QgBhATSRsCaYFuSkAYupBCS
Would you like fries with that? How we added AI-powered recommendations to Developer Hub
Kyra, Josephine (Red Hat)
https://clovanote.naver.com/s/sZ2fJVZTEP7JAAQqXjnD6CS
Transforming Your Knowledge Ecosystem: Strategies for Enhanced Personal Knowledge Management with LLMs and Second Brain Techniques
Marc Herren
https://clovanote.naver.com/s/oNEEhGkQSKTzZ7QwZvbYUdS
Build the new generation of AI Agents with Google Gemini and Angular
Gerard Sans
https://clovanote.naver.com/s/yndcFKo4auiJiQKQW7SxBWS
From ML to LLM: On-Device AI in the Browser
Nico Martin
https://clovanote.naver.com/s/jYu9eWthoa6PhwRTVdHQ4CS
Training Large Language Models in the Open
Leandro von Werra (Hugging Face)
https://clovanote.naver.com/s/KGChzS7nwEimrthCgRXfG7S
Sunita Asnani from Powercoders gave a session about cultural diversity & inclusion, and Malik Tanner from Remotecoders supported the session.
Big thanks to Stacy, for inviting me to the conference :)
Author's bio: Hello, I'm Chaewon Yoo from South Korea. I'm doing a web developer internship at Powercoders and Iterativ in Bern, working on the Jobmate project, and I'm currently looking for a job in Bern as an IT support specialist, web developer, or data analyst. Here's my LinkedIn.