Katherine Druckman

@katherined@reality2.social
865 Followers
529 Following
238 Posts
Open source enthusiast, formerly of the late, great Linux Journal. Drupalist, software engineer, podcaster, decorative arts enthusiast. Ex-derby girl. Formerly https://social.librem.one/@katherined

Things I like: Open Source, Privacy, Drupal, Linux, Security. 
Podcasts: Reality 2.0, Open at Intel, co-host FLOSS Weekly.

Work: Open Source Evangelist at Intel. #IamIntel (opinions are my own)

Pronouns: She/her

#opensource #privacy #security #tech #linux #drupal #podcast
Reality 2.0 Podcast: https://reality2cast.com
Open at Intel Podcast: https://openatintel.podbean.com
Me: https://www.katherinedruckman.com
Github: https://github.com/kdruckman
A short preview of my conversation with Andreea Munteanu of Canonical: https://www.youtube.com/shorts/ThZKG23hxOg #AI #OpenSource
Canonical's Data Science Stack and AI's Open Future #podcast #nextgenai

Oh hey, another episode! This time I talked with Andreea Munteanu of Canonical about #DataScienceStack and why the future of AI is OPEN!

https://openatintel.podbean.com/e/canonicals-data-science-stack-and-ais-open-future/

#OpenSource #AI #DataScience #podcast

Canonical's Data Science Stack and AI's Open Future | Open at Intel

In this episode, Andreea Munteanu of Canonical discusses Data Science Stack, an out-of-the-box machine learning environment solution. Emphasizing the industry's shift to Kubernetes and cloud native applications, she outlines her vision for accessible and secure open source AI. The conversation also covers the importance of community contribution, challenges faced by data scientists, and the future of AI being open source.

00:00 Introduction
01:50 Data Science Stack Introduction
03:31 Community and Collaboration
06:30 Getting Started with Generative AI
08:56 Andreea's Journey into Data Science
10:59 The Future of AI and Open Source
14:57 Encouraging Open Source Contributions
17:28 Conclusion and Final Thoughts

Guest: Andreea Munteanu helps organizations drive scalable transformation projects with open source AI. She leads AI at Canonical, the publisher of Ubuntu. With a background in data science across industries like retail and telecommunications, she helps enterprises make data-driven decisions with AI.
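
For a rough idea of what working inside a ready-made ML environment like the one described above looks like, here is a minimal experiment-tracking sketch in Python. It assumes a workstation or notebook with MLflow and scikit-learn available and a tracking server at a local URI; the tool choice, endpoint, and experiment name are illustrative assumptions, not details from the episode.

```python
# Minimal experiment-tracking sketch for a ready-made ML environment.
# The tracking URI and experiment name are illustrative assumptions.
import mlflow
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

mlflow.set_tracking_uri("http://localhost:5000")  # assumed tracking endpoint
mlflow.set_experiment("dss-demo")                 # hypothetical experiment name

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)    # record the hyperparameter
    mlflow.log_metric("accuracy", accuracy)  # record the evaluation result
```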

New Open at Intel Podcast! I spoke with Andrew Brown of Exam Pro about his free generative AI bootcamp for developers, #Deepseek, keeping up with the rapid pace of AI development, and a lot more. Check it out!

Episode: https://openatintel.podbean.com/e/mastering-generative-ai/

Clip: https://youtube.com/shorts/ofsiK5cF1u0?feature=share

#OpenSource #AI #GenAI

Mastering Generative AI | Open at Intel

In this episode, Andrew Brown, founder of Exam Pro, joins the podcast to discuss his background in educational technology and his current endeavors in teaching tech certifications and coding boot camps. Andrew shares his excitement about generative AI and how developers can stay updated with rapidly evolving innovations like DeepSeek and the Open Platform for Enterprise AI (OPEA), the importance of understanding foundational concepts, and the role of open models in democratizing AI technology. The conversation also covers the relevance of deploying custom models, integrating reliable educational strategies, and ensuring developers have the knowledge to use AI applications effectively. Andrew shares insights on his courses and offers practical advice for developers keen on diving into generative AI.

00:00 Introduction and Guest Welcome
00:19 Andrew Brown's Background and Current Work
01:10 Exciting Trends in Tech Education
02:20 Deep Dive into Generative AI
05:23 DeepSeek and AI Model Costs
07:44 Challenges and Opportunities in AI Development
09:22 Open Source AI and Developer Training
11:00 Practical Advice for Aspiring Developers
13:00 Challenges and Opportunities in AI Development
18:34 Conclusion and Final Thoughts

Resources:
FREE GenAI Boot Camp
Exam Pro Resources on GitHub
Andrew on GitHub

Guest: Andrew Brown is the founder of Exam Pro, where he creates training materials for developers. He also creates free cloud certification courses for freeCodeCamp.
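
As a quick, hands-on companion to the episode, here is a minimal Python sketch of calling a hosted model through an OpenAI-compatible client, the pattern many providers (including DeepSeek) expose. The base URL, model id, and environment variable are assumptions for illustration; check the provider's documentation before relying on them.

```python
# Minimal chat-completion sketch against an OpenAI-compatible endpoint.
# Base URL, model id, and env var name are assumptions for illustration.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.deepseek.com",     # assumed endpoint
    api_key=os.environ["DEEPSEEK_API_KEY"],  # hypothetical env var
)

response = client.chat.completions.create(
    model="deepseek-chat",  # assumed model id
    messages=[
        {"role": "system", "content": "You are a concise coding assistant."},
        {"role": "user", "content": "Explain retrieval-augmented generation in two sentences."},
    ],
)
print(response.choices[0].message.content)
```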

Another new episode of Open at Intel is out! I talked with Joshua Alphonse about small language models, #OpenSource #AI, reducing bias, and other important topics in the world of AI development. Check it out! Feedback is always welcome, and I'd love to hear what topics you'd like covered in future episodes. Find us here or in your favorite podcast app: https://openatintel.podbean.com/e/breaking-down-ai-small-models-big-impacts/

#podcast #NewEpisode #SLM #LLM #GenAI

Breaking Down AI: Small Models, Big Impacts | Open at Intel

Joshua Alphonse discusses the potential of small language models, highlighting their efficiency and applicability in various domains such as financial compliance and multimedia processing. The conversation also touches upon the intersection of creativity and technology, AI's role in the future of multimedia, and the significance of open source models. Joshua emphasizes the importance of eliminating biases in AI and the exciting advancements in agentic AI and spatial AI, projecting how these innovations might shape the tech landscape in the coming years.

00:00 Introduction and Welcome
00:09 Joshua's Background and Experience
00:34 Current Projects and Innovations
03:04 The Importance of Small Language Models
06:14 Open Source and AI Ethics
10:13 Future of AI and Exciting Developments
12:20 Challenges and Controversies in AI
17:47 Conclusion and Final Thoughts

Guest: Joshua Alphonse is Head of Product at PremAI. Joshua has spent his time empowering developers to create innovative solutions using cutting-edge open-source technologies. Previously, Joshua worked at Wix, leading Product and R&D engagements for their Developer Relations Team, and at ByteDance, he created content, tutorials, and curated events for the developer community.
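
To make the small-model theme concrete, here is a minimal Python sketch of running a small open model locally with the Hugging Face transformers pipeline. The model id is simply one example of a small, permissively licensed model, not one named in the episode, and the sketch runs on CPU by default.

```python
# Minimal sketch: run a small language model locally with transformers.
# The model id is an illustrative choice, not one named in the episode.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # example small open model
)

prompt = "In one paragraph, why can small language models be a good fit for on-device use?"
output = generator(prompt, max_new_tokens=120, do_sample=False)
print(output[0]["generated_text"])
```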

Managing Kubernetes with Komodor | Open at Intel

In this episode, we speak with Udi Hofesh and Itiel Shwartz from Komodor about their roles and the mission of their company. Komodor aims to simplify Kubernetes at scale by providing tools for managing, troubleshooting, and optimizing Kubernetes clusters. They discuss the unique features of Komodor, including their approach to using AI to address Kubernetes issues and their involvement in open source projects like Helm Dashboard. The conversation also touches upon the new native integration for managing Kubernetes add-ons and the future direction of the company.

00:00 Introduction and Guest Introduction
00:27 What is Komodor?
00:59 Challenges in Kubernetes
01:32 Komodor's Unique Solutions
02:27 Target Audience and Developer Relations
06:56 Open Source Contributions
14:09 AI Integration in Komodor
18:47 New Features and Future Plans

Guests:
Itiel Shwartz, CTO and Co-founder, Komodor
Udi Hofesh, DevRel, Komodor
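
Komodor's own tooling is a product in its own right; as a plain, hand-rolled illustration of the kind of cluster introspection the episode discusses, here is a minimal Python sketch that lists unhealthy pods with the official Kubernetes client. It assumes a local kubeconfig with read access to the cluster and is not Komodor's API.

```python
# Minimal troubleshooting sketch with the official Kubernetes Python client:
# list pods across all namespaces that are not Running or Succeeded.
# Hand-rolled illustration only; this is not Komodor's API.
from kubernetes import client, config

config.load_kube_config()  # assumes a local kubeconfig with cluster access
v1 = client.CoreV1Api()

for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    phase = pod.status.phase
    if phase not in ("Running", "Succeeded"):
        print(f"{pod.metadata.namespace}/{pod.metadata.name}: {phase}")
```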

New podcast today! We chatted about #Kubernetes management, #OpenSource projects like #HelmDashboard, and when to sprinkle some #AI on your projects and do it well. Here's a preview: https://youtube.com/shorts/qVVUS4rMkGw

ROI in #OpenSource contribution is always a topic worth visiting. I spoke with Alex Scammon of G-Research about their approach to open source contribution and how it absolutely impacts the bottom line in #FinTech. Check out the full #podcast episode: https://openatintel.podbean.com/e/roi-in-open-source-contributions/

A preview: https://youtube.com/shorts/1nzLRYvqWHU

ROI in Open Source Contributions | Open at Intel

In this episode, Katherine Druckman speaks to Alex Scammon, who leads the Open Source Program Office (OSPO) at G-Research. Alex discusses the company's significant contributions to open source projects and their unique operating model. He covers the success of Armada, a CNCF sandbox project for multi-cluster batch scheduling, and the considerable efforts of G-Research's OSPO, which includes 30 engineers dedicated to direct open source contributions. Alex also shares insights on the benefits of supporting open source projects, the complexities of project prioritization, and the collaborative efforts in the open source community. The episode emphasizes the importance of sustainable open source involvement and offers a glimpse into G-Research's mission to use AI and ML tools to drive financial market predictions.

00:00 Introduction and Guest Welcome
00:08 Overview of Alex's Role and OSPO
03:27 Importance of Open Source Contributions
04:37 Prioritizing Projects and G-Research
07:27 Challenges and Collaboration
12:43 Personal Journey in Open Source
18:09 Encouraging Open Source Contributions

Guest: Alex Scammon: Currently, I'm leading a large and intrepid band of open-source engineers engaged in a number of philanthropic upstream contributions on behalf of G-Research. All of our work centers around open-source data science and machine learning tools and the MLOps and HPC infrastructure to support those tools at scale. We're almost certainly hiring.... As part of this work, I'm also leading a discussion around batch scheduling on Kubernetes as the chair of the CNCF's Batch Working Group. Please reach out if this is an area of interest for you -- we'd love to have more voices at the table!
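
Armada layers multi-cluster batch scheduling on top of Kubernetes and has its own submission APIs, which are not shown here. As a plain baseline for comparison, here is a minimal Python sketch that submits a single batch Job with the official Kubernetes client; the namespace, image, and names are illustrative assumptions.

```python
# Minimal sketch: submit a one-off batch Job with the official Kubernetes
# Python client. Plain Kubernetes only; Armada's own API is not shown.
# Namespace, image, and names are illustrative.
from kubernetes import client, config

config.load_kube_config()  # assumes a local kubeconfig with cluster access

job = client.V1Job(
    api_version="batch/v1",
    kind="Job",
    metadata=client.V1ObjectMeta(name="demo-batch-job"),
    spec=client.V1JobSpec(
        backoff_limit=2,
        template=client.V1PodTemplateSpec(
            spec=client.V1PodSpec(
                restart_policy="Never",
                containers=[
                    client.V1Container(
                        name="worker",
                        image="busybox:1.36",
                        command=["sh", "-c", "echo processing batch item; sleep 5"],
                    )
                ],
            )
        ),
    ),
)

client.BatchV1Api().create_namespaced_job(namespace="default", body=job)
```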

🎙️ New Open at Intel #Podcast is up! Join me in my recent chat with Mark Abrams, where we explore the intricate ecosystem of #cloudNative technologies and their impact on #edge computing. From SUSE's innovative edge offerings to the role of #AI and #openSource collaboration, gain a deeper understanding of the current landscape and what's next.

📲 Listen and subscribe in your favorite podcast player or: https://openatintel.podbean.com/e/cloud-native-at-the-edge/

Cloud Native at the Edge | Open at Intel

In this episode, Mark Abrams discusses his role at SUSE as a domain solution architect specializing in edge computing. He shares insights on leveraging Kubernetes for edge solutions, the evolution of the open source community, and the importance of contributing to open source projects. They also touch upon the complexities and opportunities in cloud native technologies, the impact of AI, and future developments in edge computing and the open source ecosystem.

00:00 Introduction and Setting the Scene
00:50 Mark's Role and Interests at KubeCon
02:08 Discussing the New Book: Cloud Native Edge Essentials
03:43 The Evolution of Kubernetes and Cloud Native
05:58 Challenges and Solutions in Edge Computing
08:01 Open Source Community and Contributions
14:42 Future of Edge and AI Integration
20:20 Conclusion and Final Thoughts

Guest: Mark Abrams has been involved in developing and delivering technology solutions for over 25 years. Mark has broad experience ranging from writing code for backend services, embedded systems, and user interfaces to managing and building technical teams and field activities around pre-sales engineering. Mark founded and led a technology enterprise using distributed methodologies before the modern-day cloud existed. Mark was part of the original team that brought k3s, the lightweight Kubernetes, to market. Mark is currently a proud member of the Domain Solutions Architect's team at SUSE.

One of the great takeaways from #AllThingsOpen2024 #ato2024 was learning about this excellent podcast from @katherined #Intel #AI #technology #podcast

https://podcasts.apple.com/us/podcast/open-at-intel/id1670972366

Open at Intel

Technology Podcast · 6 Seasons · Updated Biweekly

📢 The @linuxfoundation, with Harvard's Laboratory for Innovation Science, has released Census III of Free and Open Source Software – Application Libraries. 🖥️ Key insights from OpenSSF help reduce FOSS vulnerabilities and secure supply chains. Read more: https://openssf.org/press-release/2024/12/04/open-source-usage-trends-and-security-challenges-revealed-in-new-study/
Open Source Usage Trends and Security Challenges Revealed in New Study – Open Source Security Foundation