Content provided by Erik Partridge. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Erik Partridge or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://he.player.fm/legal.
AI makes us all more productive.... so why isn't revenue soaring? That's the AI Productivity Paradox.
↳ Does that mean GenAI doesn't work?
↳ Or do we all collectively stink at measuring GenAI ROI?
↳ Or are employees just pocketing that time savings?
Faisal Masud is a tech veteran with answers. He's the President of HP Digital Services, and he's going to help us solve the AI Productivity Paradox.

Newsletter: Sign up for our free daily newsletter
More on this Episode: Episode Page
Join the discussion: Have a question? Join the convo here.
Upcoming Episodes: Check out the upcoming Everyday AI Livestream lineup
Website: YourEverydayAI.com
Email The Show: info@youreverydayai.com
Connect with Jordan on LinkedIn

Topics Covered in This Episode:
AI Productivity Paradox Explained
Hybrid Work's AI Integration Challenges
Generative AI Impact on Large Enterprises
Raising Productivity Expectations in AI Era
AI Tools vs. Traditional Employment Roles
Effective AI Policy Implementation in Enterprises
Building Internal AI Capabilities Strategy
Insights from AI-Based Easy Button History

Timestamps:
00:00 "Your Daily AI Insight Hub"
03:43 Workforce Experience Platform Overview
07:46 High Hiring Bar Enhances Productivity
10:31 Enterprise Productivity Lag with GenAI
15:53 AI Integration in Workflows
19:01 Raising Expectations in Tech Management
21:57 Hiring for Future-Ready Roles
25:23 Staples' Innovative "Easy Button" Strategy
27:22 Less is More for Success

Keywords: AI productivity paradox, generative AI, productivity improvements, employee experience, HP Digital Services, hybrid work, employee productivity, generative AI wave, AI tools, workforce experience platform, AI PCs, employee sentiment data, hybrid work challenges, generative AI boom, overemployment, AI policy, large enterprises, business leaders, remote work, Microsoft Copilot, improved productivity, customer experience, agentic workflows, AI-enabled tasks, augmented roles, future of work, AI solutions, digital transformation, management challenges, augmented society, productivity metrics, less is more approach, efficient work processes.

Send Everyday AI and Jordan a text message. (We can't reply back unless you leave contact info) Ready for ROI on GenAI? Go to youreverydayai.com/partner…
Short, simple summaries of machine learning topics, to help you prepare for exams, interviews, reading the latest papers, or just a quick brush up. In less than two minutes, we'll cover the most obscure jargon and complex topics in machine learning. For more details, including small animated presentations, please visit erikpartridge.com. Please do join the conversation on Twitter for corrections, conversations, support and more at #mlbytes
K-fold cross validation is the practice by which we split a dataset into k roughly equal segments (folds), train our model on k − 1 of the folds, and validate it on the remaining one, rotating until every fold has served as the validation set once. This is generally considered a best practice, or at least good practice, in machine learning, as it helps ensure an accurate characterization of your model's performance on data it was not trained on. Machine Learning Mastery has a great post on the topic. --- Send in a voice message: https://podcasters.spotify.com/pod/show/mlbytes/message
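The rotation described above can be sketched in a few lines of pure Python. Everything here is illustrative rather than from the episode: the toy dataset, the "model" (just the mean of the training targets), and the mean-absolute-error scorer are stand-ins so the split/train/validate loop is the focus.

```python
def k_fold_indices(n, k):
    """Split indices 0..n-1 into k roughly equal folds."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(xs, ys, k, train_fn, score_fn):
    """Train on k-1 folds, validate on the held-out fold, rotate."""
    folds = k_fold_indices(len(xs), k)
    scores = []
    for val_idx in folds:
        train_idx = [j for f in folds if f is not val_idx for j in f]
        model = train_fn([xs[j] for j in train_idx],
                         [ys[j] for j in train_idx])
        scores.append(score_fn(model,
                               [xs[j] for j in val_idx],
                               [ys[j] for j in val_idx]))
    return scores

# Toy example: the "model" is just the mean of the training targets,
# scored by mean absolute error on the validation fold.
xs = list(range(10))
ys = [2.0 * x for x in xs]
train = lambda X, Y: sum(Y) / len(Y)
score = lambda m, X, Y: sum(abs(m - y) for y in Y) / len(Y)
fold_scores = cross_validate(xs, ys, 5, train, score)
```

In practice you would average `fold_scores` to get a single estimate of generalization performance; libraries such as scikit-learn package this same rotation as `KFold`.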
Stratified sampling provides a mechanism by which to split a larger dataset into smaller pieces. While purely random splits are commonly used, stratified sampling ensures that the distribution of a chosen attribute (typically the class label) stays consistent across the pieces. This can strip out variance you actually wanted to measure, but more often it beneficially reduces variance between splits.
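A minimal sketch of a stratified train/test split in pure Python; the function name, the 80/20 toy labels, and the split fraction are all illustrative choices, not from the episode. The key step is splitting each label group separately so both halves keep the original class proportions.

```python
import random
from collections import defaultdict

def stratified_split(labels, test_frac, seed=0):
    """Return (train_idx, test_idx), preserving each label's proportion."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for i, y in enumerate(labels):
        by_label[y].append(i)
    train_idx, test_idx = [], []
    for y, idxs in by_label.items():
        rng.shuffle(idxs)
        cut = int(round(len(idxs) * test_frac))  # per-class test count
        test_idx.extend(idxs[:cut])
        train_idx.extend(idxs[cut:])
    return train_idx, test_idx

labels = ["a"] * 80 + ["b"] * 20           # imbalanced 4:1 toy labels
train_idx, test_idx = stratified_split(labels, test_frac=0.25)
# Both splits keep roughly the original 4:1 ratio of "a" to "b".
```

A purely random 25% split of these 100 points could easily draw only one or two "b" examples; the stratified version guarantees the minority class is represented proportionally.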
Boosting is also an ensemble meta-algorithm, like bagging. However, in boosting we train a large number of weak but specialized learners sequentially, and combine them weighted according to their strengths. For more information on boosting, consider watching the University of Washington's great lecture on the topic.
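As a concrete instance of "weak learners combined by strength", here is a hedged sketch of AdaBoost with one-dimensional decision stumps in pure Python. The dataset, the stump search, and the round count are toy choices for illustration; real implementations (e.g. scikit-learn's `AdaBoostClassifier`) are far more general.

```python
import math

def stump_predict(x, thresh, polarity):
    return polarity if x > thresh else -polarity

def best_stump(xs, ys, w):
    """Pick the (threshold, polarity) stump with lowest weighted error."""
    best = None
    for thresh in xs:
        for polarity in (1, -1):
            err = sum(wi for xi, yi, wi in zip(xs, ys, w)
                      if stump_predict(xi, thresh, polarity) != yi)
            if best is None or err < best[0]:
                best = (err, thresh, polarity)
    return best

def adaboost(xs, ys, rounds=5):
    n = len(xs)
    w = [1.0 / n] * n                 # uniform weights to start
    learners = []                     # (alpha, thresh, polarity) triples
    for _ in range(rounds):
        err, thresh, polarity = best_stump(xs, ys, w)
        err = max(err, 1e-10)         # guard against division by zero
        alpha = 0.5 * math.log((1 - err) / err)   # learner's "strength"
        learners.append((alpha, thresh, polarity))
        # Re-weight: misclassified points get more weight next round.
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, thresh, polarity))
             for xi, yi, wi in zip(xs, ys, w)]
        total = sum(w)
        w = [wi / total for wi in w]
    return learners

def predict(learners, x):
    score = sum(a * stump_predict(x, t, p) for a, t, p in learners)
    return 1 if score > 0 else -1

xs = [1, 2, 3, 4, 5, 6]
ys = [-1, -1, -1, 1, 1, 1]            # separable at x > 3
model = adaboost(xs, ys)
```

The `alpha` weight is exactly the "combine them according to their strengths" step: accurate stumps get large votes, barely-better-than-chance stumps get small ones.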
Bagging (bootstrap aggregating) is an ensemble meta-algorithm. Basically, we take some number of estimators (usually dozens-ish) and train each one on its own bootstrap sample of the training data, drawn at random with replacement. Then, we average the predictions of the individual estimators to make the resulting prediction. While this reduces the variance of your predictions (indeed, that is the core purpose of bagging), it may come at the cost of slightly increased bias. For a more academic basis, see slide #13 of this lecture by Joëlle Pineau at McGill University.
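A minimal sketch of that resample-then-average loop, assuming a deliberately trivial base learner (each "estimator" just predicts the mean of its bootstrap sample); the data and estimator count are toy values, so the bootstrap and averaging steps are the whole point.

```python
import random

def bootstrap_sample(data, rng):
    """Sample len(data) points with replacement."""
    return [rng.choice(data) for _ in data]

def bagged_fit(data, n_estimators=25, seed=0):
    """Train each estimator on its own bootstrap sample."""
    rng = random.Random(seed)
    return [sum(s) / len(s)
            for s in (bootstrap_sample(data, rng)
                      for _ in range(n_estimators))]

def bagged_predict(estimators):
    """Average the individual estimators' predictions."""
    return sum(estimators) / len(estimators)

data = [1.0, 2.0, 3.0, 4.0, 100.0]   # one outlier
estimators = bagged_fit(data)
prediction = bagged_predict(estimators)
```

Each bootstrap sample sees the outlier a different number of times, so the individual estimators disagree; averaging them smooths that disagreement out, which is the variance reduction the summary describes.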
The bias-variance trade-off is a key problem in your model search. Bias represents how far your model's average prediction strays from the true values; it generally shrinks as algorithms grow more complex, but reducing it comes at the cost of variance. Variance is the degree to which individual predictions stray from the model's mean output for those inputs when the model is trained on different samples of the data. High variance means that a model has overfit: it has learned noise specific to the training set rather than the underlying problem. Most commonly, high bias = underfitting, high variance = overfitting. Please consider joining the conversation on Twitter. I also blog from time to time. You can find me at erikpartridge.com. For more academic sources, consider reading the slides from this fantastic Carnegie Mellon lecture.
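The trade-off can be made concrete with a small simulation (entirely my own toy setup, not from the episode): repeatedly draw noisy training sets from y = x² + noise, fit two extreme models, and compare their predictions at a fixed test point. A constant predictor ignores x (high bias, low variance); a 1-nearest-neighbor predictor memorizes the sample (low bias, high variance).

```python
import random

def draw_training_set(rng, n=20):
    xs = [rng.uniform(0, 1) for _ in range(n)]
    ys = [x * x + rng.gauss(0, 0.1) for x in xs]
    return xs, ys

def fit_constant(xs, ys):
    m = sum(ys) / len(ys)          # high bias: ignores x entirely
    return lambda x: m

def fit_1nn(xs, ys):
    pairs = list(zip(xs, ys))      # high variance: memorizes noise
    return lambda x: min(pairs, key=lambda p: abs(p[0] - x))[1]

rng = random.Random(0)
x0, true_y = 0.5, 0.25             # test point and its noiseless target
const_preds, nn_preds = [], []
for _ in range(500):               # many independent training sets
    xs, ys = draw_training_set(rng)
    const_preds.append(fit_constant(xs, ys)(x0))
    nn_preds.append(fit_1nn(xs, ys)(x0))

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((x - m) ** 2 for x in v) / len(v)

bias_const = mean(const_preds) - true_y  # large: mean of y over [0,1] is ~1/3
bias_nn = mean(nn_preds) - true_y        # small: 1-NN tracks the curve
```

Across the 500 runs, the constant model's predictions barely move but sit systematically far from 0.25, while the 1-NN predictions center near 0.25 but scatter widely: high bias versus high variance, in numbers.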
The concept of empirical risk minimization drives modern approaches to training many machine learning algorithms, including deep neural networks. Today's thirty second summary covers the basics of what you need to know, but the concept goes well beyond the simple case we discuss today. If you would like to discuss the topic further, please consider joining the conversation on Twitter. Lecture notes from Carnegie Mellon University (no affiliation).
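As a simple-case sketch of the idea (my own toy example, in the spirit of the episode's "simple case"): empirical risk minimization picks the parameter that minimizes the average loss over the observed sample. With squared loss and a constant predictor h_c(x) = c, the empirical risk R̂(c) = (1/n) Σᵢ (c − yᵢ)² is minimized by the sample mean, which we can confirm with gradient descent.

```python
ys = [1.0, 2.0, 3.0, 4.0, 10.0]    # observed sample

def empirical_risk(c, ys):
    """Average squared loss of the constant predictor c on the sample."""
    return sum((c - y) ** 2 for y in ys) / len(ys)

def erm_gradient_descent(ys, lr=0.1, steps=200):
    """Minimize the empirical risk over c by gradient descent."""
    c = 0.0
    for _ in range(steps):
        grad = sum(2 * (c - y) for y in ys) / len(ys)  # dR/dc
        c -= lr * grad
    return c

c_star = erm_gradient_descent(ys)
sample_mean = sum(ys) / len(ys)    # the closed-form minimizer
```

Deep learning follows the same recipe at vastly larger scale: the network's parameters play the role of `c`, and stochastic gradient descent minimizes the average loss over (mini-batches of) the training sample.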