<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Idlespace]]></title><description><![CDATA[Art, tech, innovation.]]></description><link>https://idlespace.ca/</link><image><url>https://idlespace.ca/favicon.png</url><title>Idlespace</title><link>https://idlespace.ca/</link></image><generator>Ghost 5.88</generator><lastBuildDate>Tue, 21 Apr 2026 21:35:22 GMT</lastBuildDate><atom:link href="https://idlespace.ca/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[How I Purposefully Shot Myself in the Foot Making My Latest Short Film]]></title><description><![CDATA[I made a short film for the MIT Global AI Film Hack, a global competition that brings together filmmakers, artists, and technologists. What follows is a detailed account of me choosing the most painful path at every fork in the road.]]></description><link>https://idlespace.ca/how-i-purposefully-shot-myself-in-the-foot-making-my-latest-short-film/</link><guid isPermaLink="false">69e3cc37be631e029181cd56</guid><category><![CDATA[creative]]></category><category><![CDATA[generativeAI]]></category><category><![CDATA[shortfilm]]></category><dc:creator><![CDATA[Caroline Kiessling]]></dc:creator><pubDate>Mon, 20 Apr 2026 19:21:48 GMT</pubDate><media:content url="https://idlespace.ca/content/images/2026/04/davinci_edit.png" medium="image"/><content:encoded><![CDATA[<h3 id="the-making-of-department-of-good-memories">The Making of <em>Department of Good Memories</em></h3><img src="https://idlespace.ca/content/images/2026/04/davinci_edit.png" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film"><p></p><p>There&apos;s a particular kind of creative decision-making that could be called &quot;ambitious stupidity.&quot; It&apos;s when you know something is 
going to be harder, you understand <em>exactly why</em> it&apos;s going to be harder, and you do it anyway. </p><p>I made a short film for the MIT Global AI Film Hack, a global competition that brings together filmmakers, artists, and technologists. What follows is a detailed account of me choosing the most painful path at every fork in the road.</p><p>The timeline was roughly two and a half weeks.</p><hr><h2 id="the-story-aka-the-first-way-i-made-my-life-harder">The Story (a.k.a. The First Way I Made My Life Harder)</h2><p>The film is called <a href="https://youtu.be/GCsWCYTblpY?ref=idlespace.ca" rel="noreferrer"><em>Department of Good Memories</em></a>. The premise: somewhere, there exists a department that stores people&apos;s good memories. Physical, glowing phials on shelves, catalogued and retrievable. Pearl, our protagonist, is the Senior Archivist. Her job is simple: receive visitors, find their memory, hand it over, put it back when they leave. She loves this job because it feeds her need for order, structure, and everything being <em>exactly where it belongs</em>.</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/Master_audio_01_00_40_03.png" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="1920" height="1080" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Master_audio_01_00_40_03.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Master_audio_01_00_40_03.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Master_audio_01_00_40_03.png 1600w, https://idlespace.ca/content/images/2026/04/Master_audio_01_00_40_03.png 1920w" sizes="(min-width: 720px) 720px"></figure><p>The theme, &quot;Open Your Eyes&quot;, inspired by John Berger&apos;s <em>Ways of Seeing</em>, asks us to move beyond seeing into presence and perspective. 
And when a phial arrives that doesn&apos;t fit, Pearl is forced to confront the idea that a single memory can mean completely different things depending on who carries it.</p><p>Cool, right? Also: kinda tricky to produce with AI tools, and we&apos;ll go into why in detail.</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/inTheAisleDarkPhial.png" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="2000" height="1116" srcset="https://idlespace.ca/content/images/size/w600/2026/04/inTheAisleDarkPhial.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/inTheAisleDarkPhial.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/inTheAisleDarkPhial.png 1600w, https://idlespace.ca/content/images/2026/04/inTheAisleDarkPhial.png 2000w" sizes="(min-width: 720px) 720px"></figure><h2 id="three-decisions-i-cant-blame-anyone-else-for-the-road-to-type-2-fun">Three Decisions I Can&apos;t Blame Anyone Else For (The Road to Type 2 Fun)</h2><h3 id="decision-1-a-confined-space">Decision 1: A Confined Space</h3><p>Most recent AI films wisely choose settings with built-in visual variety (a journey through different landscapes, a forest, a city), which is convenient for continuity. </p><p>I chose a single interior. One archive, three areas within it. Every. Single. Shot. shows the same shelves, the same phials, the same wooden counter, the same brass desk lamp. An intimate piece set in tight quarters with nowhere to hide. 
In AI filmmaking terms, this is called &quot;asking for trouble,&quot; since there are no cutaways to a sunset to bail you out.</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/walking2.gif" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="640" height="360" srcset="https://idlespace.ca/content/images/size/w600/2026/04/walking2.gif 600w, https://idlespace.ca/content/images/2026/04/walking2.gif 640w"></figure><h3 id="decision-2-an-emotionally-complex-story-with-ocd-protagonist">Decision 2: An Emotionally Complex Story with an OCD Protagonist</h3><p>Not only did I choose a confined space, I then set a story <em>about order and precision</em> inside it. Pearl&apos;s entire character is built on her love for order and clear rules. It&apos;s the setup we need so she can evolve as a character later in the film. </p><p>Getting enough control over an AI character to convey &quot;Placing this object precisely where it belongs brings me joy and peace&quot; through acting alone feels a bit like playing a souls-like game with a broken controller. (It is hard.)</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/precision3.gif" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="640" height="360" srcset="https://idlespace.ca/content/images/size/w600/2026/04/precision3.gif 600w, https://idlespace.ca/content/images/2026/04/precision3.gif 640w"></figure><p>And then, because apparently I have no clear concept of work-life balance, I decided to tell a story about grief, tenderness, and quiet revelation. An intimate story that requires subtle acting and nuanced emotional performances. </p><p>Even with newer models like Seedance 2.0 and Kling 3.0, this is arguably the hardest challenge in this space right now. Aaand I chose it anyway. See: ambitious stupidity.  
</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/emotion.gif" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="640" height="360" srcset="https://idlespace.ca/content/images/size/w600/2026/04/emotion.gif 600w, https://idlespace.ca/content/images/2026/04/emotion.gif 640w"></figure><h3 id="decision-3-bespoke-props-or-how-to-confuse-an-ai-with-objects-that-dont-exist">Decision 3: Bespoke Props (or: How to Confuse an AI with Objects That Don&apos;t Exist)</h3><p>The archive needed to feel like a grounded but also slightly otherworldly space, so I designed several bespoke props that don&apos;t exist in reality. Two of the trickiest were the <strong>double-infinity clock</strong> and the <strong>glowing rainbow blossom bonsai</strong> (which also tell us more about who Pearl, our main character, is).</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/boxWithStuff_v01-1.png" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="2000" height="1116" srcset="https://idlespace.ca/content/images/size/w600/2026/04/boxWithStuff_v01-1.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/boxWithStuff_v01-1.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/boxWithStuff_v01-1.png 1600w, https://idlespace.ca/content/images/2026/04/boxWithStuff_v01-1.png 2000w" sizes="(min-width: 720px) 720px"></figure><p>I created detailed mood boards for every prop and every set area. I sourced reference images, generated images of the props, created multi-angle grids, and used those grids as visual anchors during generation. </p><p>The problem with made-up objects is that the AI has no training data for them. Keeping them consistent was extremely hard. 
Getting the clock to look right <em>once</em> was easy, but getting it to look the same way in five different shots from five different angles while in the same exact spot was tricky even with references.</p><figure class="kg-card kg-gallery-card kg-width-wide"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-152403-1.png" width="1352" height="1217" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-19-152403-1.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-19-152403-1.png 1000w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-152403-1.png 1352w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/clocks-2.png" width="2000" height="683" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/clocks-2.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/clocks-2.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/clocks-2.png 1600w, https://idlespace.ca/content/images/size/w2400/2026/04/clocks-2.png 2400w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-144702-2.png" width="2000" height="1023" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-19-144702-2.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-19-144702-2.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Screenshot-2026-04-19-144702-2.png 1600w, 
https://idlespace.ca/content/images/size/w2400/2026/04/Screenshot-2026-04-19-144702-2.png 2400w" sizes="(min-width: 720px) 720px"></div></div></div></figure><p> But I didn&apos;t want to leave the look of this world to chance; I wanted to <em>direct</em> it, the same way you&apos;d direct a production designer on a live-action set. Eventually this became a back-and-forth between Photoshop and Nano Banana Pro to get the perfect image.</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/arrangementFixes2.gif" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="960" height="540" srcset="https://idlespace.ca/content/images/size/w600/2026/04/arrangementFixes2.gif 600w, https://idlespace.ca/content/images/2026/04/arrangementFixes2.gif 960w" sizes="(min-width: 720px) 720px"></figure><hr><h2 id="time-vs-story">Time vs. Story</h2><p>Unsurprisingly, <strong>50% of my time was spent on story</strong>. Not image generation, not video, not editing.</p><p>The hackathon is a sprint, and in my opinion story doesn&apos;t sprint well. You need to write something, walk away, come back the next day, realize the second act doesn&apos;t work, restructure it, realize the restructuring broke the ending, fix the ending, realize you now have thirty minutes of content that needs to be four minutes, and start trimming again.</p><p>This is the part of the AI filmmaking discourse that I think is heavily overlooked. Everyone wants to know what model you used, what resolution you generated at, what your prompt structure looks like. Rarely does anyone ask: &quot;How many times did you throw out your story draft?&quot; (Fourteen times. The answer is fourteen.) </p><p>&quot;AI filmmaking&quot; in the end is just <em>filmmaking</em>. You need a clear vision, and you have to make conscious decisions to get it on screen. And that will always take time.  
</p><p>One example of scope management: I originally envisioned the moment where the phial &quot;turns bad&quot; as a much bigger set piece, with a ripple of chaos spreading through the archive, knocking things off shelves, disrupting the order that Pearl has so carefully maintained. It was going to be visually dramatic and narratively satisfying. It also would have required generating another post-ripple design of the space that needed to remain consistent throughout the second half of the film. To solve the problem, I turned it into a more intimate moment.</p><p>Mercifully, this decision served both the <em>story </em>and the<em> deadline</em>.</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/phialGlow-3.gif" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="640" height="360" srcset="https://idlespace.ca/content/images/size/w600/2026/04/phialGlow-3.gif 600w, https://idlespace.ca/content/images/2026/04/phialGlow-3.gif 640w"></figure><hr><h2 id="the-human-touch-handwriting-as-soul">The Human Touch: Handwriting as Soul</h2><p>One of the details I&apos;m proudest of is also one of the least technically impressive: the handwritten notes and posters you see throughout the film were written by me on actual paper, to capture an authenticity and personality the AI couldn&apos;t produce.  
</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/IMG_0519.JPG" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="2000" height="1500" srcset="https://idlespace.ca/content/images/size/w600/2026/04/IMG_0519.JPG 600w, https://idlespace.ca/content/images/size/w1000/2026/04/IMG_0519.JPG 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/IMG_0519.JPG 1600w, https://idlespace.ca/content/images/2026/04/IMG_0519.JPG 2000w" sizes="(min-width: 720px) 720px"></figure><p>The process: I&apos;d write a note or sign by hand, scan it, generate an image with a blank piece of paper in the right position, Photoshop my scanned handwriting onto the blank paper in the generated image, and then use that composited image as a reference for video generation. Once I had the keyframe, the models kept the writing consistent across all videos. </p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/posters.gif" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="960" height="540" srcset="https://idlespace.ca/content/images/size/w600/2026/04/posters.gif 600w, https://idlespace.ca/content/images/2026/04/posters.gif 960w" sizes="(min-width: 720px) 720px"></figure><p>In a film made almost entirely by AI, this is a cool little Easter egg, and it makes the film feel very personal to me. 
</p><figure class="kg-card kg-gallery-card kg-width-wide"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/BTS_01_00_41_04.png" width="1920" height="1080" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/BTS_01_00_41_04.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/BTS_01_00_41_04.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/BTS_01_00_41_04.png 1600w, https://idlespace.ca/content/images/2026/04/BTS_01_00_41_04.png 1920w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/BTS_01_03_39_16.png" width="1920" height="1080" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/BTS_01_03_39_16.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/BTS_01_03_39_16.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/BTS_01_03_39_16.png 1600w, https://idlespace.ca/content/images/2026/04/BTS_01_03_39_16.png 1920w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/BTS_01_05_21_04.png" width="1920" height="1080" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/BTS_01_05_21_04.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/BTS_01_05_21_04.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/BTS_01_05_21_04.png 1600w, https://idlespace.ca/content/images/2026/04/BTS_01_05_21_04.png 1920w" sizes="(min-width: 720px) 720px"></div></div></div></figure><hr><h2 id="claude-as-creative-partner-not-creative-director">Claude as Creative Partner (Not 
Creative Director)</h2><p>I used Claude for three distinct roles during production. </p><h3 id="role-1-screenwriting-aid">Role 1: Screenwriting Aid </h3><h3 id="attempt-1-write-my-script-result-meh"><br><em>Attempt 1: &quot;Write My Script&quot; (Result: Meh)</em></h3><p>First, I tried having Claude help me write my script, feeding it the rough story and asking it to fill in the gaps. The result was okay-ish, but it wasn&apos;t mine, and it didn&apos;t really get the emotional arcs right. So I sat down and wrote the three-page draft myself, like the old, traditionalist Millennial that I am.</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-171943.png" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="1206" height="572" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-19-171943.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-19-171943.png 1000w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-171943.png 1206w" sizes="(min-width: 720px) 720px"></figure><h3 id="attempt-2-critique-my-script-result-extremely-useful"><em>Attempt 2: &quot;Critique My Script&quot; (Result: Extremely Useful)</em></h3><p>What worked much better was using Claude as a story consultant. Instead of asking it to <em>write</em>, I asked it to <em>read and critique</em> my work.</p><p>I asked questions that would help me refine the story, like &quot;What do you think is the message of this film?&quot; &quot;Can you relate to the main character? 
And if not, explain why.&quot; &quot;Do you see any plot holes in the logic of this story world?&quot;</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/LLM_feedback-1.png" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="1404" height="782" srcset="https://idlespace.ca/content/images/size/w600/2026/04/LLM_feedback-1.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/LLM_feedback-1.png 1000w, https://idlespace.ca/content/images/2026/04/LLM_feedback-1.png 1404w" sizes="(min-width: 720px) 720px"></figure><p>This was <em>way more</em> productive. The critique was actionable, and interestingly enough the feedback process felt smoother than with another human, because Claude stayed on topic and delivered the feedback without the social weight of being judged. Giving feedback is an art in itself, and it can come across as personal or even hurtful when the sender didn&apos;t intend it that way. I found it very easy to work with Claude&apos;s matter-of-fact responses in that regard. </p><p>There is, of course, the risk that an LLM becomes just another yes-man. I tried to avoid that by asking questions that can&apos;t be answered by simply agreeing or disagreeing but require a nuanced answer, like &quot;Yes, I think the main character is relatable; however, in scene XY her actions seem unreasonable because Z.&quot; This way it would highlight parts of the script that were incoherent or not well executed.  <br> </p><h3 id="role-2-prompt-writing-co-pilot">Role 2: Prompt Writing Co-Pilot</h3><p>After finishing my mood boards and moving into asset creation, I used Claude to interpret my mood boards and craft prompts based on them. It did a great job of capturing the essence, and I could then refine the resulting prompts, which sped up my workflow. 
</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-152918.png" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="1770" height="978" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-19-152918.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-19-152918.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Screenshot-2026-04-19-152918.png 1600w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-152918.png 1770w" sizes="(min-width: 720px) 720px"></figure><p>Once shot production started, I fed Claude all my location images, character reference sheets, and prop designs. Together we built a skill file (essentially a structured reference document) that Claude could consult when helping me write image and video generation prompts.</p><p>Instead of writing out Pearl&apos;s full description every time, I could just write &quot;Pearl&quot; and Claude would expand it to the full locked character description, adjusted for what the camera would actually see in that shot.</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/image.png" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="2000" height="1236" srcset="https://idlespace.ca/content/images/size/w600/2026/04/image.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/image.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/image.png 1600w, https://idlespace.ca/content/images/2026/04/image.png 2144w" sizes="(min-width: 720px) 720px"></figure><p>I also had Claude research prompting best practices for the specific tools I was using (Nano Banana Pro for images, Veo and Kling for video) and incorporated those findings into the skill files. 
</p><h3 id="role-3-dialogue-polish">Role 3: Dialogue Polish</h3><p>Since I&apos;m not a native English speaker, I used Claude to sanity-check dialogue. &quot;Does this sound like something a real person would say?&quot; The kind of thing you&apos;d ask a trusted friend who happens to be a native speaker. Except this particular friend is available at 3AM while you are rewriting dialogue on your phone in bed, right before falling asleep.</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/LLM_dialogue.png" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="1336" height="723" srcset="https://idlespace.ca/content/images/size/w600/2026/04/LLM_dialogue.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/LLM_dialogue.png 1000w, https://idlespace.ca/content/images/2026/04/LLM_dialogue.png 1336w" sizes="(min-width: 720px) 720px"></figure><hr><h2 id="the-other-tools">The Other Tools </h2><h3 id="organization-miro-as-mission-control">Organization: Miro as Mission Control</h3><p>I love Miro for organizing everything from mood boards and storyboard sketches to script pages with handwritten annotations in the margins. 
Everything is accessible at a glance.</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-183354.png" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="2000" height="1283" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-19-183354.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-19-183354.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Screenshot-2026-04-19-183354.png 1600w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-183354.png 2222w" sizes="(min-width: 720px) 720px"></figure><h3 id="image-generation-nano-banana-pro">Image Generation: Nano Banana Pro</h3><p>Virtually all images were generated using Google&apos;s Nano Banana Pro, and 99.9% of video clips started from an image, not from text alone. Image-to-video gives you dramatically more control than text-to-video, especially when your continuity requirements are high. </p><p>A few things I learned and would recommend:</p><p><strong>Mood boards as input work brilliantly.</strong> Nano Banana Pro is great at reading mood boards and grid images. For the phial-lined shelves, I&apos;d feed it a mood board showing the exact phial props that should populate the shelves throughout the archive. 
</p><figure class="kg-card kg-gallery-card kg-width-wide"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-18-at-9.40.59-PM-1.png" width="2000" height="998" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-18-at-9.40.59-PM-1.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-18-at-9.40.59-PM-1.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Screenshot-2026-04-18-at-9.40.59-PM-1.png 1600w, https://idlespace.ca/content/images/size/w2400/2026/04/Screenshot-2026-04-18-at-9.40.59-PM-1.png 2400w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-152556.png" width="1515" height="898" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-19-152556.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-19-152556.png 1000w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-152556.png 1515w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/phial.jpg" width="2000" height="1444" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/phial.jpg 600w, https://idlespace.ca/content/images/size/w1000/2026/04/phial.jpg 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/phial.jpg 1600w, https://idlespace.ca/content/images/2026/04/phial.jpg 2127w" sizes="(min-width: 720px) 720px"></div></div><div class="kg-gallery-row"><div class="kg-gallery-image"><img 
src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-18-at-9.41.11-PM.png" width="1966" height="1602" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-18-at-9.41.11-PM.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-18-at-9.41.11-PM.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Screenshot-2026-04-18-at-9.41.11-PM.png 1600w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-18-at-9.41.11-PM.png 1966w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-153837.png" width="2000" height="1260" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-19-153837.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-19-153837.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Screenshot-2026-04-19-153837.png 1600w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-153837.png 2377w" sizes="(min-width: 720px) 720px"></div></div></div></figure><p>For Pearl, I created grid images showing her from multiple angles: a portrait grid for close-ups, a full-body grid for wider shots. 
If you feed a full-body reference and prompt for a close-up, the model still tends to pull the composition wider than you want.</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/charSheet_fullBody_extended.png" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="2000" height="1125" srcset="https://idlespace.ca/content/images/size/w600/2026/04/charSheet_fullBody_extended.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/charSheet_fullBody_extended.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/charSheet_fullBody_extended.png 1600w, https://idlespace.ca/content/images/2026/04/charSheet_fullBody_extended.png 2000w" sizes="(min-width: 720px) 720px"></figure><p><strong>Resolution matters (sometimes).</strong> I used Google Flow for most of my Nano Banana Pro generations, which outputs images at 1K that can be upscaled afterward. For shots that needed fine detail (facial close-ups with visible skin texture, or wide shots with dozens of readable phials), the 1K pixel space sometimes didn&apos;t produce great results. In those cases I&apos;d jump to PixVerse (one of the hackathon sponsors), whose API access offers native 4K Nano Banana Pro generations.</p><h3 id="video-generation-veo-kling">Video Generation: Veo + Kling</h3><p>Google Flow using <strong>Veo 3.1 Fast</strong> (low priority) was my default for video generation. This was an economic decision, because AI filmmaking is an iteration-heavy process. With this model I could generate unlimited clips without burning credits (this requires the Google AI Ultra plan).</p><p>For shots that needed subtle emotional acting or complex movement sequences, I switched to <strong>Kling 3.0 Omni</strong>. I used it through two of the sponsor platforms: TapNow and OpenArt.</p><p>Both had their strengths. 
TapNow&apos;s node-based workflow felt the most intuitive to me, and it has a lot of quality-of-life tools, like the ability to save character sheet images as assets that can be added quickly. </p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-18-at-8.23.46-PM.png" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="2000" height="1008" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-18-at-8.23.46-PM.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-18-at-8.23.46-PM.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Screenshot-2026-04-18-at-8.23.46-PM.png 1600w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-18-at-8.23.46-PM.png 2000w" sizes="(min-width: 720px) 720px"></figure><p>But OpenArt&apos;s Multi-Shot feature for Kling 3.0 Omni was also really cool: you could describe a jump cut from a medium shot to a close-up in a single generation, define references for both, set the length of each shot, and maintain acting consistency across both.</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-18-at-8.22.37-PM.png" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="2000" height="1015" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-18-at-8.22.37-PM.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-18-at-8.22.37-PM.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Screenshot-2026-04-18-at-8.22.37-PM.png 1600w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-18-at-8.22.37-PM.png 2000w" sizes="(min-width: 720px) 720px"></figure><p><strong>Expression images as supplements.</strong> For emotional close-ups, I generated separate 
reference images of Pearl with specific facial expressions and fed those alongside the character sheet. The expression image would guide the mood while the character sheet maintained identity (most of the time).</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-18-at-8.25.16-PM.png" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="2000" height="1002" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-18-at-8.25.16-PM.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-18-at-8.25.16-PM.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Screenshot-2026-04-18-at-8.25.16-PM.png 1600w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-18-at-8.25.16-PM.png 2000w" sizes="(min-width: 720px) 720px"></figure><h3 id="post-production-davinci-resolve">Post-Production: DaVinci Resolve</h3><p>All editing happened in DaVinci Resolve. There&apos;s also one &apos;VFX shot&apos; where the AI helpfully decided Pearl needed a twin sister and put two of her in the same frame. (Thanks.) I fixed this with classic masking in Resolve because it was faster and more efficient than burning more credits on a shot that was otherwise fine.</p><figure class="kg-card kg-video-card kg-width-regular" data-kg-thumbnail="https://idlespace.ca/content/media/2026/04/retouch2_thumb.jpg" data-kg-custom-thumbnail>
            <div class="kg-video-container">
                <video src="https://idlespace.ca/content/media/2026/04/retouch2.mp4" poster="https://img.spacergif.org/v1/640x360/0a/spacer.png" width="640" height="360" loop autoplay muted playsinline preload="metadata" style="background: transparent url(&apos;https://idlespace.ca/content/media/2026/04/retouch2_thumb.jpg&apos;) 50% 50% / cover no-repeat;"></video>
                <div class="kg-video-overlay">
                    <button class="kg-video-large-play-icon" aria-label="Play video">
                        <svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24">
                            <path d="M23.14 10.608 2.253.164A1.559 1.559 0 0 0 0 1.557v20.887a1.558 1.558 0 0 0 2.253 1.392L23.14 13.393a1.557 1.557 0 0 0 0-2.785Z"/>
                        </svg>
                    </button>
                </div>
                <div class="kg-video-player-container kg-video-hide">
                    <div class="kg-video-player">
                        <button class="kg-video-play-icon" aria-label="Play video">
                            <svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24">
                                <path d="M23.14 10.608 2.253.164A1.559 1.559 0 0 0 0 1.557v20.887a1.558 1.558 0 0 0 2.253 1.392L23.14 13.393a1.557 1.557 0 0 0 0-2.785Z"/>
                            </svg>
                        </button>
                        <button class="kg-video-pause-icon kg-video-hide" aria-label="Pause video">
                            <svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24">
                                <rect x="3" y="1" width="7" height="22" rx="1.5" ry="1.5"/>
                                <rect x="14" y="1" width="7" height="22" rx="1.5" ry="1.5"/>
                            </svg>
                        </button>
                        <span class="kg-video-current-time">0:00</span>
                        <div class="kg-video-time">
                            /<span class="kg-video-duration">0:18</span>
                        </div>
                        <input type="range" class="kg-video-seek-slider" max="100" value="0">
                        <button class="kg-video-playback-rate" aria-label="Adjust playback speed">1&#xD7;</button>
                        <button class="kg-video-unmute-icon" aria-label="Unmute">
                            <svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24">
                                <path d="M15.189 2.021a9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h1.794a.249.249 0 0 1 .221.133 9.73 9.73 0 0 0 7.924 4.85h.06a1 1 0 0 0 1-1V3.02a1 1 0 0 0-1.06-.998Z"/>
                            </svg>
                        </button>
                        <button class="kg-video-mute-icon kg-video-hide" aria-label="Mute">
                            <svg xmlns="http://www.w3.org/2000/svg" viewbox="0 0 24 24">
                                <path d="M16.177 4.3a.248.248 0 0 0 .073-.176v-1.1a1 1 0 0 0-1.061-1 9.728 9.728 0 0 0-7.924 4.85.249.249 0 0 1-.221.133H5.25a3 3 0 0 0-3 3v2a3 3 0 0 0 3 3h.114a.251.251 0 0 0 .177-.073ZM23.707 1.706A1 1 0 0 0 22.293.292l-22 22a1 1 0 0 0 0 1.414l.009.009a1 1 0 0 0 1.405-.009l6.63-6.631A.251.251 0 0 1 8.515 17a.245.245 0 0 1 .177.075 10.081 10.081 0 0 0 6.5 2.92 1 1 0 0 0 1.061-1V9.266a.247.247 0 0 1 .073-.176Z"/>
                            </svg>
                        </button>
                        <input type="range" class="kg-video-volume-slider" max="100" value="100">
                    </div>
                </div>
            </div>
            
</figure><h3 id="good-old-storyboarding">Good Old Storyboarding</h3><p>Throughout the editing process, I kept drawing storyboards to fill gaps and figure out layout and scene progression. Storyboard frames help me push the edit further and catch issues before I (worst case) spend hours getting a specific shot in Veo/Kling that eventually ends up being cut. In that regard, I think the process of AI filmmaking is very close to animation, where producing a shot is expensive and you can&apos;t just shoot five hours of footage only to use two minutes of it in the edit. That&apos;s just not economical. </p><figure class="kg-card kg-gallery-card kg-width-wide"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-162241.png" width="1761" height="922" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-19-162241.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-19-162241.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Screenshot-2026-04-19-162241.png 1600w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-162241.png 1761w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-162320.png" width="1757" height="837" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-19-162320.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-19-162320.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Screenshot-2026-04-19-162320.png 1600w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-162320.png 
1757w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-162447.png" width="1772" height="862" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-19-162447.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-19-162447.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Screenshot-2026-04-19-162447.png 1600w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-162447.png 1772w" sizes="(min-width: 720px) 720px"></div></div><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-162529.png" width="1807" height="887" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-19-162529.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-19-162529.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Screenshot-2026-04-19-162529.png 1600w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-162529.png 1807w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-162608.png" width="1786" height="822" loading="lazy" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" srcset="https://idlespace.ca/content/images/size/w600/2026/04/Screenshot-2026-04-19-162608.png 600w, https://idlespace.ca/content/images/size/w1000/2026/04/Screenshot-2026-04-19-162608.png 1000w, https://idlespace.ca/content/images/size/w1600/2026/04/Screenshot-2026-04-19-162608.png 1600w, https://idlespace.ca/content/images/2026/04/Screenshot-2026-04-19-162608.png 1786w" 
sizes="(min-width: 720px) 720px"></div></div></div></figure><p>Every generation that isn&apos;t informed by a clear creative intention is a wasted generation. Therefore, my storyboards were one of the most valuable assets in the entire production. (Look at me upselling my shitty little drawings.) </p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2026/04/storyboard.gif" class="kg-image" alt="How I Purposefully Shot Myself in the Foot Making My Latest Short Film" loading="lazy" width="640" height="360" srcset="https://idlespace.ca/content/images/size/w600/2026/04/storyboard.gif 600w, https://idlespace.ca/content/images/2026/04/storyboard.gif 640w"></figure><hr><p></p><h2 id="what-id-do-differently">What I&apos;d Do Differently</h2><p><strong>More time for story.</strong> The hackathon format is thrilling, but it fights the creative process. Writing a good story takes time. And then, once your edit comes together, new problems reveal themselves. That&apos;s where I think the real filmmaking happens, and it&apos;s kind of impossible with such a tight deadline.</p><p><strong>Voice acting.</strong> For my previous film <a href="https://youtu.be/xBTn5tOlbrQ?si=w1EfLyHZ4GYwlk_1&amp;ref=idlespace.ca" rel="noreferrer"><em>Roots of Tomorrow</em></a>, I recorded real voice performances and ran them through ElevenLabs&apos; voice changer to get human-grounded dialogue. I wanted to do the same here, but the clock ran out. AI-generated voices are increasingly impressive, but there&apos;s still a quality to real human vocal performance that&apos;s hard to replicate.</p><p><strong>Polish shots and a proper grading pass.</strong> In an ideal world, I would have gone back, pushed the acting further, fixed some continuity issues that could easily have been solved with better keyframes, and done a proper grading pass. </p><p><strong>More skill file iteration.</strong> The Claude skill file I built for prompt generation was useful but not fully optimized. 
</p><p><strong>Restyling of images and videos.</strong> I experimented with restyling shots to get exact framing, but it just took too long to get results. The 5% increase in quality was not worth the time spent, given the hackathon&apos;s time limitations.</p><p></p><h3 id="the-summary">The Summary</h3><p>I purposefully shot myself in the foot with the story, setting, and characters I chose, but somehow made it across the finish line. With a range of new tools combined with good old human stubbornness, I managed to bring that idea I had in my mind onto the screen. And that&apos;s kinda cool, I think.</p><figure class="kg-card kg-embed-card"><iframe width="200" height="113" src="https://www.youtube.com/embed/GCsWCYTblpY?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen title="Department of Good Memories"></iframe></figure><hr><p><em>Department of Good Memories was created for the MIT Global AI Film Hack 2026, Season 4. Theme: &quot;Open Your Eyes.&quot;</em></p><p><em>Tools used: Google Nano Banana Pro (image generation), Google Veo 3.1 (video generation), Kling 3.0 Omni via TapNow and OpenArt (video generation), PixVerse (Nano Banana Pro 4K image generation), Claude by Anthropic (story development, prompt engineering, dialogue polish), Miro (project management), DaVinci Resolve (editing and color), Suno (music), ElevenLabs (sound).</em></p><p><em>Images generated: ~2490</em></p><p><em>Videos generated: ~2010</em></p><p><em>Shots in the edit: 192</em></p><p><em>Hours worked: ~205</em></p>]]></content:encoded></item><item><title><![CDATA[The Hyperconnectivity Trap: Why "Fast" Communication is Slowing You Down]]></title><description><![CDATA[<p>Communication has never been faster or cheaper. While we may be saving a lot on carrier pigeon feed, we have a new set of challenges in the modern day. 
We live in a world of constant pings, notifications, and the relentless expectation of instant replies. One might think this &quot;</p>]]></description><link>https://idlespace.ca/the-hyperconnectivity-trap/</link><guid isPermaLink="false">67fd6ce2be631e029181cb2e</guid><category><![CDATA[communication]]></category><category><![CDATA[generativeAI]]></category><category><![CDATA[project management]]></category><category><![CDATA[skill]]></category><dc:creator><![CDATA[Erin Leonard]]></dc:creator><pubDate>Tue, 15 Apr 2025 05:09:24 GMT</pubDate><media:content url="https://idlespace.ca/content/images/2025/04/ErinArticleImage.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://idlespace.ca/content/images/2025/04/ErinArticleImage.jpg" alt="The Hyperconnectivity Trap: Why &quot;Fast&quot; Communication is Slowing You Down"><p>Communication has never been faster or cheaper. While we may be saving a lot on carrier pigeon feed, we have a new set of challenges in the modern day. We live in a world of constant pings, notifications, and the relentless expectation of instant replies. One might think this &quot;hyperconnectivity&quot; is making us more productive than ever before, but is it really? Here at idlespace, a collective of engineers, artists, filmmakers, and tech enthusiasts, we&apos;ve come to a more nuanced realization: the speed at which we communicate might actually be bogging us down and making us less productive.</p><p><strong>The Illusion of Efficiency: The Perils of &quot;Fast&quot; Communication</strong></p><p>Our brains are wired for immediate responses. Evolutionary pressures molded us to react in real-time, where a moment&apos;s hesitation could mean life or death. The modern workplace, however, is not the unforgiving savannah.</p>
<!--kg-card-begin: html-->
<div style="text-align: center">
  <center>
    <img src="https://media.giphy.com/media/VEVfqy0Vu4c7xziUUN/giphy.gif" alt="The Hyperconnectivity Trap: Why &quot;Fast&quot; Communication is Slowing You Down" style="width: 50%; height: auto;">
    </center>
</div>
<!--kg-card-end: html-->
<p>Nowadays, the constant barrage of messages, emails, and Slack notifications creates an overwhelming sense of urgency, leading us to feel pressure to respond quickly. Some obvious pitfalls of rushing a response include:</p><ul><li><strong>Incomplete/Incorrect Information: </strong>When we provide answers without fully understanding the context or gathering all the necessary information, it leads to errors and miscommunication.</li><li><strong>Too Many Notifications: </strong>A flurry of fragmented messages creates a chaotic information landscape, making it difficult to sift through and find what&apos;s important.</li><li><strong>Increased Stress: </strong>The constant pressure to respond creates a sense of anxiety and overwhelm. This not only negatively affects us, but also the people we&apos;re messaging. Who doesn&#x2019;t get stressed by those little red notification dots?</li><li><strong>Interruptions Galore: </strong>Each notification disrupts our focus, leading to reduced productivity.</li></ul><p>This is the Hyperconnectivity Trap. We mistake activity for productivity. We equate speed with efficiency.</p><p>Much like the two modes of thinking described in Daniel Kahneman&apos;s book &quot;Thinking, Fast and Slow,&quot; we need to differentiate between fast and slow communication. &quot;Fast&quot; is reactive and emotional; &quot;slow&quot; is thoughtful and logical. But how might one go about embracing &quot;slow&quot; communication?</p>
<!--kg-card-begin: html-->
<blockquote class="pull-quote">
  <p>We mistake activity for productivity. We equate speed with efficiency.</p>
</blockquote>
<!--kg-card-end: html-->
<p><strong>The Power of Asynchronous Communication: Embracing the &quot;Slow&quot;</strong></p><p>Here&apos;s a story to illustrate how I&apos;ve used slow communication. Recently, our team faced a challenge: a critical member was traveling to a country in a substantially different time zone. In the past, this time difference had been a source of frustration, leading to fragmented communication and delays. This time, we decided to turn it into an advantage.</p><p>Instead of bombarding our traveling colleague with a constant barrage of messages throughout the day, we adopted a new approach. The team routed all communication through me. I compiled updates, questions, and action items into a single, comprehensive message in Slack while he was sleeping. This message evolved throughout <em>our</em> day, and I sent it before logging off in the evening.</p><p>The result? When our colleague woke up, he received one clear, concise summary instead of a chaotic flood of notifications. This improved his focus, made it easier for him to respond, and ultimately contributed to the project&apos;s success.</p><p><strong>Nurturing Growth: The &quot;Slow&quot; Approach to Mentorship</strong></p><p>I&apos;ve also seen the benefits of &quot;slow&quot; communication in mentoring junior colleagues. Fresh graduates often come to the workplace with a flurry of questions, which is perfectly understandable. However, constant interruptions can disrupt workflow for both the mentor and the mentee.</p><p>To address this, I implemented a structured communication schedule. With very new juniors, we had a dedicated &quot;question time&quot; twice a day. More autonomous juniors had a single daily session. 
The rules were simple: compile your questions, try to work through them independently, and bring them to our scheduled meeting.</p><p>The results were remarkable:</p><ul><li><strong>Efficient Workflow: </strong>Both the juniors and I experienced less disruption and greater focus.</li><li><strong>Increased Independence: </strong>The juniors often found that by persevering, they could answer many of their own questions.</li><li><strong>Enhanced Learning: </strong>Scheduled meetings provided dedicated time for training and relationship building.</li><li><strong>Reduced Anxiety: </strong>Knowing they would have my undivided attention put the juniors at ease.</li></ul><p>These juniors even carried these techniques to other teams, promoting a culture of thoughtful communication.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://idlespace.ca/content/images/2025/04/communication_modes-1.png" class="kg-image" alt="The Hyperconnectivity Trap: Why &quot;Fast&quot; Communication is Slowing You Down" loading="lazy" width="1536" height="1024" srcset="https://idlespace.ca/content/images/size/w600/2025/04/communication_modes-1.png 600w, https://idlespace.ca/content/images/size/w1000/2025/04/communication_modes-1.png 1000w, https://idlespace.ca/content/images/2025/04/communication_modes-1.png 1536w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">A pretty package stands in contrast to a tornado of messages. It&apos;s giving juxtaposition.</span></figcaption></figure><p><strong>The Power of AI-Enhanced &quot;Slow&quot; Communication</strong></p><p>Embracing &quot;slow&quot; communication offers a multitude of benefits, and AI can be a powerful ally in this endeavor.</p><ul><li><strong>Accuracy and Completeness: </strong>Let&#x2019;s say your boss asks you to refresh his memory regarding the purpose of an upcoming meeting with a client. 
Instead of blurting out whatever you remember off the top of your head, use an LLM like Gmail&#x2019;s built-in Gemini to read through your emails and ask it to summarize that for you. It can even make you an agenda to boot! Why bother responding off the cuff when you have AI on your side?</li><li><strong>Clearer Composition: </strong>Instead of just typing a verbal diarrhea email and hitting send, ask an LLM to help compose it. One trick I like to use: I explain the message I want to write, then ask the LLM to ask me three clarification questions before generating a response. This helps identify blind spots in what could have been a poorly thought-out email.</li><li><strong>Improved Prioritization:</strong> Instead of going through your inbox by reading each individual email, use an LLM to summarize entire threads. This can help give you a better overview of your inbox, allowing you to better triage your priorities.</li><li><strong>Meeting Management: </strong>Meetings, while often necessary, can be hugely disruptive if not managed effectively. How often has a meeting derailed your entire day? Often I&apos;m caught up working on something discussed in a meeting I just had, only to realize hours later I should have been working on something else of a higher priority. AI tools can record and transcribe meetings, generating summaries and extracting action items. When action items are recorded and shared, team members can address them when their schedules allow and prioritize them against their existing workload.</li></ul><p>While all the recommendations above take a little extra time compared to hurried actions, I assure you the time taken is well worth it.</p><p><strong>The Importance of Waiting for Your &#x201C;Point-Person&#x201D;</strong></p><p>When communicating externally, I strongly recommend (nay, I implore) designating a single point of contact for the external team to communicate with. 
This approach offers several crucial benefits:</p><ul><li><strong>Ensuring Alignment:</strong> Designating a single point of contact helps your team internally coordinate and prevent redundant communication, maintaining a professional, unified front.</li><li><strong>Maintaining Clarity:</strong> Having a point-person keeps chains of communication clear, orderly, and linear. This comes in handy if you ever need to revisit something months or even years down the road (I&apos;ve been there).</li></ul><p>While it might be tempting to jump in and respond, especially when you have the answer readily available, remember that patience is a virtue in external communication. Designating and respecting a single point-person fosters a professional, coordinated image, avoids unnecessary confusion, and ultimately strengthens your team&apos;s external relationships. This approach prioritizes clarity and efficiency over speed.</p><p><strong>What can you do?</strong></p><p>Now that you&#x2019;ve read all this, how can you apply these lessons in your day-to-day?</p><ol><li><strong>Embrace the Pause: </strong>Back in the day when paper products were pricey and letters languished on long journeys, you had no other choice but to reflect first and write second. Let&apos;s bring some of that back. Take a beat to reflect, gather your thoughts, and communicate with intention.</li><li><strong>Designate a Point-Person for all external communication: </strong>By establishing a clear point of contact, you streamline external interactions, minimize misunderstandings, define clear roles, and present a united front.&#xA0;</li><li><strong>Use AI to your Advantage: </strong>AI tools can be invaluable allies in promoting &quot;slow&quot; communication. By leveraging AI for tasks like email summarization, drafting assistance, and meeting management, you can free up mental bandwidth and create space for more thoughtful and deliberate communication practices.</li></ol>
<!--kg-card-begin: html-->
<div style="text-align: center">
  <center>
    <img src="https://media.giphy.com/media/xUA7aKRkiCCqaQb5jG/giphy.gif" alt="The Hyperconnectivity Trap: Why &quot;Fast&quot; Communication is Slowing You Down" style="width: 50%; height: auto;">
    </center>
</div>
<!--kg-card-end: html-->
<p>Now go forth and slay thy day!</p>]]></content:encoded></item><item><title><![CDATA[Crafting Ava - an AI Short Film]]></title><description><![CDATA[Chapter One: Let There be Light.
When I started working on Ava, I wanted to see what kind of film I could create using only AI tools. People often talk about AI filmmaking in terms of speed, about how quickly they can put something together. But what’s the point?]]></description><link>https://idlespace.ca/ava-chapter-one/</link><guid isPermaLink="false">67eb47c0be631e029181ca7b</guid><category><![CDATA[generativeAI]]></category><category><![CDATA[shortfilm]]></category><category><![CDATA[creative]]></category><dc:creator><![CDATA[Daniel Titz]]></dc:creator><pubDate>Tue, 01 Apr 2025 04:07:00 GMT</pubDate><media:content url="https://idlespace.ca/content/images/2025/04/Ava_Title_Chapter1-3.png" medium="image"/><content:encoded><![CDATA[<h3 id="chapter-one-let-there-be-light">Chapter One: Let There be Light.</h3><img src="https://idlespace.ca/content/images/2025/04/Ava_Title_Chapter1-3.png" alt="Crafting Ava - an AI Short Film"><p></p><p>As Ava is brought back to life, memories resurface, revealing a haunting truth.</p><figure class="kg-card kg-embed-card"><iframe src="https://player.vimeo.com/video/1071260669?app_id=122963" width="426" height="240" frameborder="0" allow="autoplay; fullscreen; picture-in-picture; clipboard-write; encrypted-media" title="Ava - AI short film"></iframe></figure><hr><p>When I started working on <em>Ava</em>, I wanted to see what kind of film I could create using only AI tools. People often talk about AI filmmaking in terms of speed, about how quickly they can put something together. But what&#x2019;s the point? I didn&#x2019;t mind if the process took time. What&#x2019;s important for me is the story. I wanted to craft each frame with meaning and emotional depth.</p><p>And that&#x2019;s where things started to get somewhat challenging.</p><h3 id="creating-a-world-piece-by-piece"><strong>Creating a World Piece by Piece</strong></h3><hr><p>AI image generation is often unpredictable. 
It&#x2019;s like playing a slot machine, where every pull gives you something slightly different. I wanted more control. I wanted consistency.</p><p>Together with <a href="https://idlespace.ca/author/caroline/" rel="noreferrer">Caroline Kiessling</a>, I started by building my own image generation workflow using ComfyUI with FLUX as a base. This gave me an initial level of control. I then added my own photography to guide the light and style and make it more personal and consistent.</p><p>Creating the environments was still not easy. Having different camera angles of the same scene and making them feel like they belong together, like they exist in the same world, took effort. I used prompts together with pose references and sheets to push the compositions in the right direction. But AI often reinterprets or resists.</p><p>For some scenes, this process made me cheat: I upscaled sections from wider shots I had already created and used them as close-ups, or jumped over the camera axis to feature a part of the environment that hadn&#x2019;t been established yet. And although I often struggled to create a frame exactly as I envisioned it, sometimes I was surprised by random happy accidents that led to entirely new frames or compositions. Most of the time, this was rather frustrating, but sometimes it felt exciting.</p><p>In the end, I refined the shots in Photoshop, making them look as real and accurate as possible. 
And the more I worked on them, the less it felt like I was creating something and the more it felt like I was fixing mistakes and cleaning up some of the AI mess.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://idlespace.ca/content/images/2025/04/Ava_Process.jpg" class="kg-image" alt="Crafting Ava - an AI Short Film" loading="lazy" width="2000" height="1705" srcset="https://idlespace.ca/content/images/size/w600/2025/04/Ava_Process.jpg 600w, https://idlespace.ca/content/images/size/w1000/2025/04/Ava_Process.jpg 1000w, https://idlespace.ca/content/images/size/w1600/2025/04/Ava_Process.jpg 1600w, https://idlespace.ca/content/images/2025/04/Ava_Process.jpg 2000w" sizes="(min-width: 720px) 720px"><figcaption><span style="white-space: pre-wrap;">The more specific the shot, the more I had to adjust things myself.</span></figcaption></figure><p>At this point, I had as much control as I needed, and achieving consistency seemed doable, but it was certainly a lot of work.</p><h3 id="conjuring-ava"><strong>Conjuring Ava</strong></h3><hr><p>To shape Ava into a cohesive character, Caroline and I trained a custom LoRA model on an initial set of generated character images, which we then added to the custom workflow. Using the LoRA was fairly straightforward and worked quite well, despite the occasional randomness. 
But soon, I had to deal with something I wasn&#x2019;t expecting.</p><figure class="kg-card kg-image-card"><img src="https://idlespace.ca/content/images/2025/04/Ava_Consistency.jpg" class="kg-image" alt="Crafting Ava - an AI Short Film" loading="lazy" width="1556" height="880" srcset="https://idlespace.ca/content/images/size/w600/2025/04/Ava_Consistency.jpg 600w, https://idlespace.ca/content/images/size/w1000/2025/04/Ava_Consistency.jpg 1000w, https://idlespace.ca/content/images/2025/04/Ava_Consistency.jpg 1556w" sizes="(min-width: 720px) 720px"></figure><p>The more scenes I visualized, the more the line between this fictional story and my creative process itself began to blur. Ava, as a character, exists entirely because I created her. She has no agency, no voice beyond what I give her. The story paints a world where artificial humans are controlled and exploited, and as I shaped Ava&#x2019;s journey, I started feeling an odd sense of guilt. I was placing her in these situations. I was pulling the strings. Of course, she isn&#x2019;t real, but the act of crafting a character and controlling every aspect of their existence started to mirror the themes of the film in ways I hadn&#x2019;t expected.</p><p>The further I got into the process, the more I questioned where we&#x2019;re heading. The ability to conjure digital humans, to place them in any scenario we want, is both fascinating and unsettling. As it becomes easier to mirror reality, how does this shift our responsibilities as creators?&#xA0;</p><p>AI gives us an incredible amount of power&#x2014;but what do we do with it?</p>]]></content:encoded></item><item><title><![CDATA[Sing Your Way to Knowledge: Using Generative AI to Learn]]></title><description><![CDATA[I’ve been playing around with Suno, a tool that lets you create songs with artificial intelligence. 
While it’s not going to replace the human touch of a well-composed singer/songwriter piece, it got me thinking...]]></description><link>https://idlespace.ca/sing-your-way-to-knowledge-using-generative-ai-to-learn/</link><guid isPermaLink="false">67cf3aa153ebb9be12c07474</guid><category><![CDATA[generativeAI]]></category><category><![CDATA[tech]]></category><dc:creator><![CDATA[Caroline Kiessling]]></dc:creator><pubDate>Fri, 14 Mar 2025 22:54:48 GMT</pubDate><media:content url="https://idlespace.ca/content/images/2025/03/sunoLearning_slim2.png" medium="image"/><content:encoded><![CDATA[<img src="https://idlespace.ca/content/images/2025/03/sunoLearning_slim2.png" alt="Sing Your Way to Knowledge: Using Generative AI to Learn"><p>You know that feeling when you can remember every song lyric from 20 years ago but struggle to recall anything useful? Like how I can flawlessly belt out Avril Lavigne&#x2019;s &quot;Complicated&quot; but somehow forget my own postal code? Science says music is a memory supercharger. So, I thought: What if I used AI to make songs that help me learn?</p>
<!--kg-card-begin: html-->
<div style="text-align: center;">
  <img src="https://media.giphy.com/media/1ioHOweNbNF1q11pna/giphy.gif" alt="Sing Your Way to Knowledge: Using Generative AI to Learn" width="50%">
</div>
<!--kg-card-end: html-->
<h3 id="my-experiment-ai-powered-broadway-themed-learning">My Experiment: AI-Powered Broadway-Themed Learning</h3><p>I&#x2019;ve been playing around with <a href="https://suno.com/?ref=idlespace.ca" rel="noreferrer"><strong>Suno</strong></a>, a tool that lets you create songs with artificial intelligence. While it&#x2019;s not going to replace the human touch of a well-composed singer/songwriter piece, it got me thinking...</p><p>Back in high school, I discovered an odd but effective study hack. To memorize my lines for drama club (shout out to my former English Drama Club buddies), I recorded them on my crappy MP3 player&#x2014;<strong>in song.</strong> It worked so well that I started using melodies for everything: phone numbers, addresses, memorizing stuff for my final physics exam, you name it.</p><p>So naturally, I had to see if I could <strong>use AI-generated music to help me memorize something actually useful.</strong></p><h3 id="step-1-turn-information-into-lyrics-with-chatgpt">Step 1: Turn Information into Lyrics with ChatGPT</h3><p>Since generative AI is the rabbit hole I&apos;ve been inhabiting for the past few months, I picked the concept of diffusion models for my test. I started by asking ChatGPT to break it down in a simple way. To make sure the information was accurate, I also provided text from trusted educational sources.</p><p>Then, I took it one step further&#x2014;<strong>I asked it to rewrite that information into song lyrics.</strong> Since I have a soft spot for Broadway-style numbers (why simply learn when you can learn <em>dramatically</em>?), I asked for lyrics with a theatrical flair.</p>
<!--kg-card-begin: html-->
<div style="text-align: center;">
  <img src="https://media.giphy.com/media/l378octkVFCX8ApaM/giphy.gif" alt="Sing Your Way to Knowledge: Using Generative AI to Learn" width="50%">
</div>
<!--kg-card-end: html-->
<h3 id="step-2-bring-it-to-life-with-generative-ai">Step 2: Bring It to Life with Generative AI</h3><p>With my freshly written lyrics, I turned to Suno to do the heavy lifting on the music side. I pasted the text, wrote a prompt for the style, and let it work its AI sorcery. In seconds, I had a full song about the diffusion process, complete with harmonies and dramatic pauses that I could now listen to on repeat. </p><figure class="kg-card kg-embed-card"><iframe width="100%" height="140" style="border-radius: 12px" frameborder="0" allowfullscreen allow="autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture" loading="lazy" src="https://suno.com/embed/5c2449df-ea39-4365-9ad2-fee6d1ec437b"></iframe></figure><p><em>(Okay let&apos;s be real, it took me five tries before I was satisfied. But that&#x2019;s just because I&#x2019;m a musical theatre snob.)</em></p><h3 id="step-3-use-it-to-memorize">Step 3: Use It to Memorize</h3><p>The results? Surprisingly effective. After I listened to my AI-generated song on repeat for a while, I was able to sing along and easily recall the steps of how diffusion models work. It turns out <a href="https://theconversation.com/the-science-of-why-you-can-remember-song-lyrics-from-years-ago-204167?ref=idlespace.ca" rel="noreferrer"><strong>music-based learning isn&#x2019;t just fun, it also works.</strong></a><strong> </strong>Obviously the information in the song lyrics is very high level and just serves as an additional memory aid to understand complex relationships. </p><p>Studies strongly support the idea that music activates multiple parts of the brain and aids in recall and long-term retention. It&#x2019;s why you still remember how to sing the alphabet decades later, but struggle to retain a list of passwords. <a href="https://time.com/6167197/psychology-behind-remembering-music/?ref=idlespace.ca" rel="noreferrer">Our brains are wired to remember songs</a> so why not make the most of it?! 
</p><ul><li><strong>Engagement</strong>: A boring topic suddenly becomes entertaining when it&#x2019;s sung.</li><li><strong>Memory Boost</strong>: Music helps with retention, thanks to rhythm and repetition.</li><li><strong>Customization</strong>: You can tailor songs to your needs&#x2014;whether you&#x2019;re learning medical terms, historical events, or even a new language.</li><li><strong>It&#x2019;s Just Fun</strong>: If I have to choose between dry textbooks and a jazz-hands-filled number about quantum physics, I know what I&#x2019;m picking.</li></ul><p><em>(Disclaimer: While it sounds like a universal solution, it might not work for everyone. Brains are complex, squishy things with their own quirks and preferred learning styles.)</em></p><h3 id="the-verdict-ai-songs-for-the-win">The Verdict: AI Songs for the Win</h3><p>For me, AI-generated music isn&apos;t replacing traditional music anytime soon, but <strong>as a mnemonic aid, it&#x2019;s brilliant.</strong> If you struggle to remember information, why not turn it into a song? Pick your favorite music style and give it a try. Who knows? Your next earworm might just be a song about the <strong>Ariolimax columbianus</strong> (it&apos;s the <a href="https://en.wikipedia.org/wiki/Ariolimax_columbianus?ref=idlespace.ca" rel="noreferrer">Pacific banana slug</a>, in case you were wondering).</p><p>Now, if you&#x2019;ll excuse me, I have to turn my grocery list into a song...<br></p>
<!--kg-card-begin: html-->
<div style="text-align: center;">
  <img src="https://media.giphy.com/media/d5pxuhdsiPJVIV5otl/giphy.gif" alt="Sing Your Way to Knowledge: Using Generative AI to Learn" width="50%">
</div>
<!--kg-card-end: html-->
<p>Further reading:</p><p><a href="https://time.com/6167197/psychology-behind-remembering-music/?ref=idlespace.ca" rel="noreferrer">Why We Remember Music and Forget Everything Else</a></p><p><a href="https://theconversation.com/the-science-of-why-you-can-remember-song-lyrics-from-years-ago-204167?ref=idlespace.ca" rel="noreferrer">The science of why you can remember song lyrics from years ago</a></p><p><a href="https://magazine.hms.harvard.edu/articles/how-music-resonates-brain?ref=idlespace.ca" rel="noreferrer">How Music Resonates in the Brain</a></p><p><a href="https://med.stanford.edu/news/all-news/2007/07/music-moves-brain-to-pay-attention-stanford-study-finds.html?ref=idlespace.ca" rel="noreferrer">Music moves brain to pay attention, Stanford study finds</a></p><p><a href="https://today.usc.edu/does-music-unlock-memory/?ref=idlespace.ca" rel="noreferrer">Does music unlock memory?</a></p><p><a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC10605363/?ref=idlespace.ca" rel="noreferrer">Cognitive Crescendo: How Music Shapes the Brain&#x2019;s Structure and Function</a></p><p><a href="https://www.sciencedaily.com/releases/2024/08/240828224256.htm?ref=idlespace.ca" rel="noreferrer">Neuroscientists explore the intersection of music and memory</a></p>]]></content:encoded></item></channel></rss>