Keeping you up to date with the latest trends and best-performing architectures in this fast-evolving field of computer science. Selecting papers by comparative results, citations, and influence, we educate you on the latest research. Consider supporting us on Patreon.com/PapersRead with feedback and ideas.
A weekly show all about audiobooks, recorded at the RNIB Talking Book studios. We talk to your favourite authors and narrators, along with reviews and news about new audiobooks. Presented and produced by Robert Kirkwood, you'll find a new episode here every Friday at 1pm, plus bonus content such as longer uncut interviews and episodes of our occasional extra show, The Book Group. Talking Books is a free service from RNIB giving access to over 40,000 fiction and non-fiction books for adults and ...
The Internet's Auditory Version of Reddit
Left On Read is a podcast where I'll be talking through all my bookish thoughts. Let's hang 🖤 • (Yes, I did rebrand the cover art. I wanted something simpler and less busy.) • https://instagram.com/ashleyyyreads
Tune in to our daily Bible readings!
Welcome to We Read It One Night, the bookish comedy podcast where sisters Alison and Rachel introduce you to the next romance novel that’ll make you want to stay up all night reading. Subscribe! Follow! Rate! Review! Tell your friends about us! Instagram: @wereaditonenight Twitter: @wereaditpodcast Facebook: We Read It One Night TikTok: @wereaditonenight Email: wereaditonenight [at] gmail.com
Two best friends do a deep dive into a different series, looking at one book each month and discussing its plot, characters, and themes.
It’s about life, but not just life. When you hear the voice of esteemed historian Finn Melanson, you’ll be hooked.
Two girls - one single, one not - discuss different dating points of view and try to unpack why we do what we do while dating in the 21st century.
On Read This One, we read all of our favorite children's books for you to follow along with before bed, in the car, or wherever! It's a perfect podcast for children learning to read. Have a favorite book you want us to read? Great! You can email us at readthisonepodcast@gmail.com to tell us what you want to hear. Let's discover new books together! Support this podcast: https://podcasters.spotify.com/pod/show/samuel289/support
Chris and Emily discuss books and literary adventures
InstantMesh: Efficient 3D Mesh Generation from a Single Image with Sparse-view Large Reconstruction Models
20:46
We present InstantMesh, a feed-forward framework for instant 3D mesh generation from a single image, featuring state-of-the-art generation quality and significant training scalability. By synergizing the strengths of an off-the-shelf multiview diffusion model and a sparse-view reconstruction model based on the LRM architecture, InstantMesh is able …
Ep. 111- Nobody Is As Hot As Harry Dresden (Storm Front)
2:08:39
The Year of the Dresden is finally here! Hannah and Laura are covering the first third of Jim Butcher's Storm Front and, friends, this book is a romp. Laura also makes a bold claim about television shows that she watched growing up and gushes about a new favorite book. Hannah has delved into a lot of recommendations that were given to her by Laura a…
394 - Do ONE Of These Things For $500,000
59:25
RED BUBBLE STORE: https://rdbl.co/2BXMEkq DISCORD: https://discord.com/invite/uWZkb2a 4:50 - Read It On Reddit 16:12 - Ask Reddit 21:59 - Today I Advice 28:40 - Shower Thoughts 34:22 - Podnapping: Mythbusters AMA - readitpodcast@gmail.com - Ask Us Anything! LET YOUR GUARD DOWN! By Read It Podcasts
367: Chioma Okereke - Water Baby & Percival Everett on The Trees: A Novel
57:44
Read On this week features Chioma Okereke with her coming-of-age audiobook Water Baby, set in the floating slum of Makoko in Lagos. We hear about The Trees: A Novel by Percival Everett and find new books entering the RNIB Library. By RNIB Connect Radio
In this episode, I talk about all the books I hauled and read in March. Mistakes were made, and my bank account cried. Let's just get into it 🤦♀️
Episode 205 - Author Spotlight with Yulin Kuang
1:35:11
Welcome to Episode 205! April is National Poetry Month and we are here for it. Emily is currently reading YOU ARE HERE: Poetry in the Natural World, a new anthology edited by Ada Limón, and Chris is reading BOATS FOR WOMEN by Sandra Yannone. Since our last episode, Chris finished listening to WAKE UP WITH PURPOSE! What I’ve Learned in my First Hundr…
Read the Bible in One Year (NLT) | April 19th
14:25
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
Read the Bible in One Year (NLT) | April 18th
15:34
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
Read the Bible in One Year (NLT) | April 17th
13:59
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
From Words to Numbers: Your Large Language Model Is Secretly A Capable Regressor When Given In-Context Examples
36:41
We analyze how well pre-trained large language models (e.g., Llama2, GPT-4, Claude 3) can do linear and non-linear regression when given in-context examples, without any additional training or gradient updates. Our findings reveal that several large language models (e.g., GPT-4, Claude 3) are able to perform regression tasks with a performance…
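The in-context regression setup described above boils down to serializing numeric (x, y) pairs into a prompt and asking the model to continue the pattern. A minimal sketch of that prompt construction; the `Input:`/`Output:` template here is an illustrative assumption, not necessarily the paper's exact format:

```python
def build_regression_prompt(examples, x_query):
    """Serialize (x, y) pairs as in-context examples for an LLM.

    The model is expected to continue the pattern and fill in the final
    Output line -- no additional training or gradient updates involved.
    """
    lines = [f"Input: {x:.2f}\nOutput: {y:.2f}" for x, y in examples]
    lines.append(f"Input: {x_query:.2f}\nOutput:")
    return "\n".join(lines)

# A hidden linear function y = 3x + 1, shown to the model only via examples.
examples = [(x, 3 * x + 1) for x in [0.0, 1.0, 2.0, 3.0]]
prompt = build_regression_prompt(examples, 4.0)
```

Sending `prompt` to a model and parsing the completion back into a float is the whole loop; the paper's finding is that strong models complete such prompts with surprisingly accurate values.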
Ep. 94 - Hook, Line & Sinker by Tessa Bailey
1:08:26
Hey pals! Today, we're reeled in by listener suggestion HOOK, LINE & SINKER by Tessa Bailey, the sister sequel to IT HAPPENED ONE SUMMER. Fox and Hannah take us through a steamy friends-to-lovers story filled with sea shanties, the Pacific Northwest (with accompanying Twilight vibes), and plenty of angst. Enjoy the show! Ep. 38 - It Happened One Summer b…
Read the Bible in One Year (NLT) | April 16th
13:19
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
AutoCodeRover: Autonomous Program Improvement
1:00:32
Researchers have made significant progress in automating the software development process in the past decades. Automated techniques for issue summarization, bug reproduction, fault localization, and program repair have been built to ease the workload of developers. Recent progress in Large Language Models (LLMs) has significantly impacted the devel…
TrustLLM: Trustworthiness in Large Language Models
2:48:17
Large language models (LLMs), exemplified by ChatGPT, have gained considerable attention for their excellent natural language processing capabilities. Nonetheless, these LLMs present many challenges, particularly in the realm of trustworthiness. Therefore, ensuring the trustworthiness of LLMs emerges as an important topic. This paper introduces Tru…
Read the Bible in One Year (NLT) | April 15th
13:33
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
Read the Bible in One Year (NLT) | April 14th
17:10
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
Read the Bible in One Year (NLT) | April 13th
13:40
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
AniPortrait: Audio-Driven Synthesis of Photorealistic Portrait Animation
11:57
In this study, we propose AniPortrait, a novel framework for generating high-quality animation driven by audio and a reference portrait image. Our methodology is divided into two stages. Initially, we extract 3D intermediate representations from audio and project them into a sequence of 2D facial landmarks. Subsequently, we employ a robust diffusio…
Read the Bible in One Year (NLT) | April 12th
17:12
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
Fast Timing-Conditioned Latent Audio Diffusion
43:22
Generating long-form 44.1kHz stereo audio from text prompts can be computationally demanding. Further, most previous works do not tackle the fact that music and sound effects naturally vary in their duration. Our research focuses on the efficient generation of long-form, variable-length stereo music and sounds at 44.1kHz using text prompts with a generative…
Read the Bible in One Year (NLT) | April 11th
15:01
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
Gaussian Head Avatar: Ultra High-fidelity Head Avatar via Dynamic Gaussians
35:11
Creating high-fidelity 3D head avatars has always been a research hotspot, but there remains a great challenge under lightweight sparse view setups. In this paper, we propose Gaussian Head Avatar represented by controllable 3D Gaussians for high-fidelity head avatar modeling. We optimize the neutral 3D Gaussians and a fully learned MLP-based deform…
BONUS EPISODE- "Only one bed. 100%. All the time." Chatting About Queer Romances with Dani Finn
1:14:17
Hannah and Laura had so much fun talking with the wonderful Dani Finn about queer romance as a genre, queer characters, and tropes! Dani shares about their writing journey, talks publishing and some of their favorite books featuring queer love. If you're looking to add to your TBR, this is the episode you need to listen to! Be sure to follow Dani a…
Read the Bible in One Year (NLT) | April 10th
16:12
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
ReFT: Representation Finetuning for Language Models
33:24
Parameter-efficient fine-tuning (PEFT) methods seek to adapt large models via updates to a small number of weights. However, much prior interpretability work has shown that representations encode rich semantic information, suggesting that editing representations might be a more powerful alternative. Here, we pursue this hypothesis by developing a f…
Read the Bible in One Year (NLT) | April 9th
11:25
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
Long-form factuality in large language models
37:15
Large language models (LLMs) often generate content that contains factual errors when responding to fact-seeking prompts on open-ended topics. To benchmark a model's long-form factuality in open domains, we first use GPT-4 to generate LongFact, a prompt set comprising thousands of questions spanning 38 topics. We then propose that LLM agents can be…
393 - Reddit Is Now A Public Company (RDDT)
1:06:50
RED BUBBLE STORE: https://rdbl.co/2BXMEkq DISCORD: https://discord.com/invite/uWZkb2a 1:59 - Read It On Reddit 22:28 - Ask Reddit 35:06 - Today I Advice 39:29 - Shower Thoughts 48:40 - Podnapping: r/trivia - Absurd American Trivia AMA - readitpodcast@gmail.com - Ask Us Anything! LET YOUR GUARD DOWN! By Read It Podcasts
Read the Bible in One Year (NLT) | April 8th
10:02
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
Read the Bible in One Year (NLT) | April 7th
17:44
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
Jamba: A Hybrid Transformer-Mamba Language Model
25:58
We present Jamba, a new base large language model based on a novel hybrid Transformer-Mamba mixture-of-experts (MoE) architecture. Specifically, Jamba interleaves blocks of Transformer and Mamba layers, enjoying the benefits of both model families. MoE is added in some of these layers to increase model capacity while keeping active parameter usage …
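The interleaving described above can be pictured as a simple layer schedule: mostly Mamba blocks with an occasional attention block, and MoE replacing the dense MLP in some layers. The specific ratios below are illustrative assumptions, not necessarily Jamba's published configuration:

```python
def jamba_like_schedule(n_layers, attn_every=8, moe_every=2):
    """Build a toy hybrid layer schedule: mostly Mamba layers with an
    occasional Transformer (attention) layer, plus MoE MLPs in some layers.

    The ratios (1 attention layer per 8, MoE every other layer) are
    illustrative assumptions for this sketch.
    """
    schedule = []
    for i in range(n_layers):
        mixer = "attention" if i % attn_every == 0 else "mamba"
        mlp = "moe" if i % moe_every == 1 else "dense"
        schedule.append((mixer, mlp))
    return schedule

sched = jamba_like_schedule(8)
# One attention layer, seven Mamba layers, MoE in alternating positions.
```

The appeal of such a schedule is that the rare attention layers supply global token mixing while the Mamba majority keeps inference memory and throughput closer to a state-space model.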
Read the Bible in One Year (NLT) | April 6th
18:46
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
QA-LoRA: Quantization-Aware Low-Rank Adaptation of Large Language Models
36:22
Recent years have witnessed a rapid development of large language models (LLMs). Despite the strong ability in many language-understanding tasks, the heavy computational burden largely restricts the application of LLMs especially when one needs to deploy them onto edge devices. In this paper, we propose a quantization-aware low-rank adaptation (Q…
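The core idea behind quantization-aware low-rank adaptation, frozen low-bit base weights plus a small trainable low-rank correction, can be illustrated in a few lines. This toy uses per-tensor symmetric quantization and a rank-1 adapter; QA-LoRA's actual group-wise quantization scheme is richer than this sketch:

```python
def quantize(w, n_bits=4):
    """Toy per-tensor symmetric uniform quantization of a weight vector."""
    scale = max(abs(v) for v in w) / (2 ** (n_bits - 1) - 1)
    return [round(v / scale) * scale for v in w]

def qa_lora_forward(x, w_q, a, b):
    """y = x·W_q + (x·a)·b: frozen quantized base weights plus a rank-1
    trainable correction (a plays the role of LoRA's A, b of B)."""
    base = sum(xi * wi for xi, wi in zip(x, w_q))
    return base + sum(xi * ai for xi, ai in zip(x, a)) * b

w_q = quantize([0.9, -0.5, 0.3])                            # low-bit base
x = [1.0, 2.0, 0.5]
y_frozen = qa_lora_forward(x, w_q, [0.0, 0.0, 0.0], 0.0)    # adapter off
y_tuned = qa_lora_forward(x, w_q, [1.0, 0.0, 0.0], 2.0)     # adapter on
```

Only `a` and `b` would receive gradients during fine-tuning; the quantized base stays fixed, which is what makes deployment on constrained devices attractive.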
366: Louise Hare - Harlem After Midnight
57:45
In today's show we're re-joined by Louise Hare as she tells us about the sequel to Miss Aldridge Regrets, Harlem After Midnight. Plus we delve into the archive to talk to Sara Collins about The Confessions of Frannie Langton, and find some brand new audiobooks that are also available from RNIB Talking Books.…
Read the Bible in One Year (NLT) | April 5th
17:38
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
MegaBlocks: Efficient Sparse Training with Mixture-of-Experts
41:52
We present MegaBlocks, a system for efficient Mixture-of-Experts (MoE) training on GPUs. Our system is motivated by the limitations of current frameworks, which restrict the dynamic routing in MoE layers to satisfy the constraints of existing software and hardware. These formulations force a tradeoff between model quality and hardware efficiency, a…
Read the Bible in One Year (NLT) | April 4th
13:43
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
VoiceCraft: Zero-Shot Speech Editing and Text-to-Speech in the Wild
38:32
We introduce VoiceCraft, a token-infilling neural codec language model that achieves state-of-the-art performance on both speech editing and zero-shot text-to-speech (TTS) on audiobooks, internet videos, and podcasts. VoiceCraft employs a Transformer decoder architecture and introduces a token rearrangement procedure that combines causal masking a…
Indie Intermission Ep. 15- "Her power is her perseverance.” An interview with Bryan S. Glosemeyer (Before the Shattered Gates of Heaven, Vol. 1)
1:12:49
Indiemission #5 is coming to a close, but not without an awesome author interview! This week Hannah and Laura are thrilled to have Bryan S. Glosemeyer on the pod to talk about reading, his writing journey, and of course, Before the Shattered Gates of Heaven. They chat about the book's characters, themes, inspirations for the series and what's to co…
Read the Bible in One Year (NLT) | April 3rd
17:41
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
LLMLingua-2: Data Distillation for Efficient and Faithful Task-Agnostic Prompt Compression
27:45
This paper focuses on task-agnostic prompt compression for better generalizability and efficiency. Considering the redundancy in natural language, existing approaches compress prompts by removing tokens or lexical units according to their information entropy obtained from a causal language model such as LLaMa-7B. The challenge is that information e…
Ep. 93 - Always Only You by Chloe Liese
1:40:57
Happy Autism Acceptance Month!!! We're celebrating with another one of the books that made Alison realize she was autistic: ALWAYS ONLY YOU by Chloe Liese. And how better to celebrate this seminal book than with a seminal guest host? I'm joined by Elodie, aka the famous book club member who chose The Viscount Who Loved Me in March 2020, changed bot…
Read the Bible in One Year (NLT) | April 2nd
17:13
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
392 - My House Burnt Down And It Annoyed My Neighbour
58:20
RED BUBBLE STORE: https://rdbl.co/2BXMEkq DISCORD: https://discord.com/invite/uWZkb2a 3:30 - Read It On Reddit 12:06 - Ask Reddit 23:26 - Today I Advice 35:28 - Shower Thoughts 45:00 - Podnapping: Kids Say The Darndest Things AMA - readitpodcast@gmail.com - Ask Us Anything! LET YOUR GUARD DOWN! By Read It Podcasts
Read the Bible in One Year (NLT) | April 1st
18:10
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
Read the Bible in One Year (NLT) | March 31st
15:00
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
Read the Bible in One Year (NLT) | March 30th
19:54
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
365: Neil Lancaster - The Devil You Know
57:45
This week Robert Kirkwood talks to former military policeman and former Met detective Neil Lancaster about the latest DS Max Craigie book, The Devil You Know, and how his career influenced his writing. Plus we find some brand new books out now and entering the Talking Books library. By RNIB Connect Radio
Read the Bible in One Year (NLT) | March 29th
15:53
Listen to our Daily Bible Readings on YouTube! Support the show. By Paddy
Evolutionary Optimization of Model Merging Recipes
29:47
We present a novel application of evolutionary algorithms to automate the creation of powerful foundation models. While model merging has emerged as a promising approach for LLM development due to its cost-effectiveness, it currently relies on human intuition and domain knowledge, limiting its potential. Here, we propose an evolutionary approach th…
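As a toy illustration of evolutionary search over merge recipes, the sketch below evolves a single interpolation coefficient between two weight vectors with a (1+1) evolution strategy. Real recipes search many per-layer coefficients, and the fitness function would be held-out task performance rather than this stand-in loss; both simplifications are assumptions of the sketch:

```python
import random

def merge(w1, w2, alpha):
    """Linear weight-space merge of two models; alpha is the recipe."""
    return [alpha * a + (1 - alpha) * b for a, b in zip(w1, w2)]

def evolve_merge(w1, w2, loss_fn, generations=200, seed=0):
    """(1+1) evolution strategy over the merge coefficient: mutate alpha,
    keep the candidate only if it lowers the loss."""
    rng = random.Random(seed)
    alpha = 0.5
    best = loss_fn(merge(w1, w2, alpha))
    for _ in range(generations):
        cand = min(1.0, max(0.0, alpha + rng.gauss(0.0, 0.1)))
        cand_loss = loss_fn(merge(w1, w2, cand))
        if cand_loss < best:
            alpha, best = cand, cand_loss
    return alpha

# Toy "models" and a stand-in fitness: distance to a target weight vector.
w1, w2 = [1.0, 1.0], [0.0, 0.0]
target = [0.8, 0.8]
loss_fn = lambda w: sum((wi - ti) ** 2 for wi, ti in zip(w, target))
alpha = evolve_merge(w1, w2, loss_fn)
```

Because candidates are accepted only when they improve the loss, the evolved recipe is never worse than the naive 50/50 average it starts from, which is the basic appeal of letting search rather than intuition pick the merge.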