<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:media="http://search.yahoo.com/mrss/"
	>

<channel>
	<title>EsenKa</title>
	<link>https://esenka.co</link>
	<description>EsenKa</description>
	<pubDate>Fri, 02 Jan 2026 17:48:21 +0000</pubDate>
	<generator>https://esenka.co</generator>
	<language>en</language>
	
		
	<item>
		<title>Genuary</title>
				
		<link>http://esenka.co/Genuary</link>

		<comments></comments>

		<pubDate>Fri, 02 Jan 2026 17:48:21 +0000</pubDate>

		<dc:creator>EsenKa</dc:creator>
		
		<category><![CDATA[]]></category>

		<guid isPermaLink="false">458836</guid>

		<description></description>
		
		<excerpt></excerpt>

		<!--<wfw:commentRss></wfw:commentRss>-->

	</item>
		
		
	<item>
		<title>PIP</title>
				
		<link>http://esenka.co/PIP</link>

		<comments></comments>

		<pubDate>Tue, 16 Sep 2025 08:19:06 +0000</pubDate>

		<dc:creator>EsenKa</dc:creator>
		
		<category><![CDATA[]]></category>

		<guid isPermaLink="false">457861</guid>

		<description>Esen K. Tütüncü
Purposefully Induced Psychosis (PIP)
PIP is a speculative storytelling engine that embraces AI hallucinations as a form of computational imagination. Rather than suppressing “mistakes,” the system fine-tunes large language models to generate surreal, poetic, and metaphorical responses: what we call purposefully induced hallucinations.


&#60;img width="2142" height="756" width_o="2142" height_o="756" src_o="https://cortex.persona.co/t/original/i/de34ace4d37c304e05b8183a778dd0f5a8ad5ec91ec134ad7699584c76a9de4e/pip2.png" data-mid="1419225" border="0" /&#62;
PIP exists both as a conversational interface and a mixed-reality experience. Users interact with the AI through speech and gesture, while the model’s outputs are parsed into structured forms that generate dynamic 3D objects, materials, and spatial behaviors in real time. These hallucinated responses are not only spoken aloud but visualized in the user’s physical space, blending language, embodiment, and interaction.
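The parsing step can be sketched roughly as follows. This is a minimal illustration with a hypothetical schema (object, material, behavior), since the project’s actual output format is not shown here; a fallback spec keeps the experience running when the model’s text is not parseable:

```python
import json

def parse_hallucination(response_text):
    """Parse a model response into a structured scene spec.

    Hypothetical schema: expects a JSON object with "object",
    "material" and "behavior" keys; falls back to a default spec
    when the hallucinated output is free-form text instead.
    """
    try:
        spec = json.loads(response_text)
    except json.JSONDecodeError:
        # free-form hallucination: spawn a default ambient object
        return {"object": "orb", "material": "mist", "behavior": "drift"}
    return {
        "object": spec.get("object", "orb"),
        "material": spec.get("material", "mist"),
        "behavior": spec.get("behavior", "drift"),
    }
```

A downstream renderer would map each spec to a 3D object, material, and spatial behavior in the user’s space.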

&#60;img width="2162" height="1168" width_o="2162" height_o="1168" src_o="https://cortex.persona.co/t/original/i/f0dda1917a0c7dfcea91c0f9851a791d34750f33e8d1892914538a6d191cba99/pip3.png" data-mid="1419226" border="0" /&#62;


The project includes a full pipeline from user input to immersive output, and introduces a visual framework for tracing language model "madness", including word embedding projections that contrast metaphorical structure with dissociative, nonlinear responses.
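As an illustration of such word-embedding projections, here is a small sketch with hypothetical data and PCA via SVD, in which more scattered embeddings stand in for dissociative, nonlinear responses and tighter clusters for metaphorical structure:

```python
import numpy as np

def project_2d(embeddings):
    """Project word embeddings to 2D via PCA (SVD on centered data)."""
    X = embeddings - embeddings.mean(axis=0)
    # right singular vectors are the principal directions
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:2].T

# hypothetical embeddings: rows are words, columns are embedding dims
rng = np.random.default_rng(0)
metaphorical = rng.normal(0.0, 0.5, size=(10, 64))   # tight cluster
dissociative = rng.normal(0.0, 2.0, size=(10, 64))   # scattered
pts = project_2d(np.vstack([metaphorical, dissociative]))

# spread (mean distance from centroid) distinguishes the two response types
def spread(p):
    return np.linalg.norm(p - p.mean(axis=0), axis=1).mean()
```

In the projected plane, the dissociative points show a much larger spread than the metaphorical ones, which is the contrast the framework visualizes.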


PIP asks: what if the hallucinations of AI systems were not errors, but invitations? What kinds of stories emerge when the machine misfires on purpose?


The paper (co-authored with Kris Pilcher) was published at CHI 2025 as part of the Microsoft Tools for Thought workshop. You can read it here. In collaboration with Kris Pilcher and Joe Davis.</description>
		
		<excerpt>Esen K. Tütüncü Purposefully Induced Psychosis (PIP). PIP is a speculative storytelling engine that embraces AI hallucinations as a form of computational...</excerpt>

		<!--<wfw:commentRss></wfw:commentRss>-->

	</item>
		
		
	<item>
		<title>Totomi</title>
				
		<link>http://esenka.co/Totomi</link>

		<comments></comments>

		<pubDate>Thu, 13 Feb 2025 20:52:40 +0000</pubDate>

		<dc:creator>EsenKa</dc:creator>
		
		<category><![CDATA[]]></category>

		<guid isPermaLink="false">455876</guid>

		<description>Esen K. Tütüncü
“Are you tired of being tired?” “Introducing Totomi! Your new well-being buddy! It’s not just a gadget; it’s your tiny, lovable companion that makes taking care of yourself fun.”



Totomi is a physical-virtual companion, like a "well-being twin," that reflects your environment and habits.

If you’re in a polluted area, it feels bad; if you’re active and in a healthy space, it thrives.

By caring for it, you’re encouraged to make better choices, turning self-care into an interactive, emotional experience.
This project was created as part of the MIT Reality Hack ’25.
We began by selecting sensors that could provide meaningful insights in real-world contexts. An Inertial Measurement Unit (IMU) captured daily movement data, while air quality and temperature sensors provided comprehensive environmental monitoring.

The data transmission utilized a 3D-printed, battery-powered mascot equipped with an ESP32 microcontroller, which integrated with Unity through the Singularity package. We leveraged the MRUK SDK alongside Meta's XR development tools, including Hand Tracking and Audio SDKs. VFX Graphs enabled sophisticated spatial simulations of smog and weather phenomena.
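As a rough illustration of this kind of sensor-to-engine transmission, here is a minimal Python sketch that packs readings into a JSON datagram and sends it over UDP. The field names, host, and port are hypothetical, and the actual project ran on an ESP32 with the Singularity package rather than this code:

```python
import json
import socket

def encode_reading(imu, air_quality, temperature):
    """Pack one sensor sample as a JSON datagram.

    Illustrative payload format, not the project's actual protocol.
    """
    return json.dumps({
        "imu": imu,          # e.g. [ax, ay, az] from the IMU
        "aqi": air_quality,  # air quality index reading
        "temp_c": temperature,
    }).encode("utf-8")

def send_reading(payload, host="127.0.0.1", port=9000):
    """Fire-and-forget UDP send, as a microcontroller loop might do."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))
```

On the Unity side, a listener would decode each datagram and drive the mascot’s state (for example, making Totomi wilt when the air quality reading is poor).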

All 3D assets were created using Blender, involving custom modeling, rigging, and animation to ensure a highly personalized and immersive experience.
In collaboration with Krystian Zun, Daniel Trujillo, Isaac Chiu and Marc Pettersen.</description>
		
		<excerpt>Esen K. Tütüncü “Are you tired of being tired?” “Introducing Totomi! Your new well-being buddy! It’s not just a gadget; it’s your tiny, lovable...</excerpt>

		<!--<wfw:commentRss></wfw:commentRss>-->

	</item>
		
		
	<item>
		<title>Leaf Pulse</title>
				
		<link>http://esenka.co/Leaf-Pulse</link>

		<comments></comments>

		<pubDate>Fri, 01 Nov 2024 05:54:23 +0000</pubDate>

		<dc:creator>EsenKa</dc:creator>
		
		<category><![CDATA[]]></category>

		<guid isPermaLink="false">454720</guid>

		<description>Esen K. Tütüncü
A blend of organic and synthetic impermanence: 3D-scanned autumn leaves with emergent behavior. For the video, follow this link.
&#60;img width="1080" height="1920" width_o="1080" height_o="1920" src_o="https://cortex.persona.co/t/original/i/ee3ba09f735a87773a8863d148d9869c9bcc40fa8bfa758e2a15826f46dcd624/render1.png" data-mid="1382871" border="0" /&#62;

&#60;img width="1080" height="1920" width_o="1080" height_o="1920" src_o="https://cortex.persona.co/t/original/i/dbf7fa96f3384020bd625153530adafa246a0fa88909314a68e751e27d264253/render3.png" data-mid="1382870" border="0" /&#62;</description>
		
		<excerpt>Esen K. Tütüncü A blend of organic and synthetic impermanence: 3D-scanned autumn leaves with emergent behavior. For the video, follow this link.</excerpt>

		<!--<wfw:commentRss></wfw:commentRss>-->

	</item>
		
		
	<item>
		<title>Writersblock</title>
				
		<link>http://esenka.co/Writersblock</link>

		<comments></comments>

		<pubDate>Fri, 01 Nov 2024 05:47:08 +0000</pubDate>

		<dc:creator>EsenKa</dc:creator>
		
		<category><![CDATA[]]></category>

		<guid isPermaLink="false">454719</guid>

		<description>Esen K. Tütüncü

Writer’s Block draws inspiration from the vibrant streets of NYC, the iconic birthplace of hip hop culture. Step into the world of mixed reality and embody the spirit of a breakdancer or graffiti artist right in your own space. Transform your surroundings into a tribute to hip hop with legendary artwork by graffiti writer Lady Pink and the soulful beats of Madlib.




We utilized the Movement SDK to capture body poses and movements for real-time feedback while you groove to hip hop tracks by legendary producers. You can study your dance poses from any angle by pausing your avatar mid-move. We used the passthrough camera and integrated body tracking, drawing on a variety of Presence Platform capabilities, including Passthrough, Gestures, Hand Tracking, Movement SDK, Spatial Audio, Interactables, and Room Mesh.





This is a project done in collaboration with Kris Pilcher, Daniel Trujillo and Nick Kaufmann.


</description>
		
		<excerpt>Esen K. Tütüncü  Writer’s Block draws inspiration from the vibrant streets of NYC, the iconic birthplace of hip hop culture. Step into the world of mixed...</excerpt>

		<!--<wfw:commentRss></wfw:commentRss>-->

	</item>
		
		
	<item>
		<title>Bots brought cookies</title>
				
		<link>http://esenka.co/Bots-brought-cookies</link>

		<comments></comments>

		<pubDate>Fri, 01 Nov 2024 05:46:12 +0000</pubDate>

		<dc:creator>EsenKa</dc:creator>
		
		<category><![CDATA[]]></category>

		<guid isPermaLink="false">454718</guid>

		<description>Esen Ka
</description>
		
		<excerpt>Esen Ka</excerpt>

		<!--<wfw:commentRss></wfw:commentRss>-->

	</item>
		
		
	<item>
		<title>Electric Sheep</title>
				
		<link>http://esenka.co/Electric-Sheep</link>

		<comments></comments>

		<pubDate>Fri, 01 Nov 2024 05:46:03 +0000</pubDate>

		<dc:creator>EsenKa</dc:creator>
		
		<category><![CDATA[]]></category>

		<guid isPermaLink="false">454717</guid>

		<description>Esen K. Tütüncü Electric Sheep is an interactive theatre piece that explored the moral dilemma two characters face in their encounter with a robot refugee that claims to be sentient. It used deepfakes, BCI headstraps that controlled the visuals in real time, and ‘Dream Interrupters’ that allowed the audience to participate in the narrative.

Developed during the Boston Tech Poetics Stage Hack, and performed at MIT Theatre and Arts.&#60;img width="4240" height="2832" width_o="4240" height_o="2832" src_o="https://cortex.persona.co/t/original/i/19d64289d17322c5ce021cabfe2670a2b68db5f80e05fd497f8a577f44a09b4f/IMG_4722.JPG" data-mid="1382875" border="0" /&#62;&#60;img width="3024" height="4032" width_o="3024" height_o="4032" src_o="https://cortex.persona.co/t/original/i/ccbda054fa34224c6ca30a9d6437cc9cf910f9ea3124484300b5bb56f64ec774/IMG_4647.jpg" data-mid="1382877" border="0" /&#62;&#60;img width="1571" height="2000" width_o="1571" height_o="2000" src_o="https://cortex.persona.co/t/original/i/ff0e6db7e7d21b9a48ceaaf9e0dd4e755ddae7c28a1102f6304432d22d322dcc/c268a26891d05fdd73f871dc83a9a6f6.png" data-mid="1382881" border="0" /&#62;&#60;img width="3024" height="4032" width_o="3024" height_o="4032" src_o="https://cortex.persona.co/t/original/i/082cbcdef462df9f3f8c4971c295265fb7b4bf5f928ce063e0e05e8008f11070/IMG_4568.jpg" data-mid="1382878" border="0" /&#62;&#60;img width="3024" height="4032" width_o="3024" height_o="4032" src_o="https://cortex.persona.co/t/original/i/f972b19df553546d098ef9c8b01066889c99516013e358a53afe261d6f7b41fd/IMG_4667.jpg" data-mid="1382879" border="0" /&#62;</description>
		
		<excerpt>Esen K. Tütüncü Electric Sheep is an interactive theatre piece that explored the moral dilemma two characters face in their encounter with a robot refugee that...</excerpt>

		<!--<wfw:commentRss></wfw:commentRss>-->

	</item>
		
		
	<item>
		<title>dynamicmodulation</title>
				
		<link>http://esenka.co/dynamicmodulation</link>

		<comments></comments>

		<pubDate>Fri, 01 Nov 2024 05:40:40 +0000</pubDate>

		<dc:creator>EsenKa</dc:creator>
		
		<category><![CDATA[]]></category>

		<guid isPermaLink="false">454716</guid>

		<description>Esen K. Tütüncü


With more and more intelligent systems surrounding us, many design decisions and system architectures that facilitate the flow of information and the scope of interaction have become crucial for the sustainability and longevity of human-computer interaction.

Although interaction can clearly happen via direct input (touch screen, camera, microphone), using the built-in sensors to extract implicit data from the user is a promising step in HCI that could potentially change the way we use our phones. The purpose of this study is to investigate possible ways in which sensors embedded in mobile devices can be used to assess cognitive load or emotional state and to dynamically modulate the device interface so that the user can engage and interact better.

The hypothesis is that we can use the built-in smartphone camera to measure Pulse Rate Variability (PRV) and derive the user’s cognitive load from it. This would allow us not only to measure cognitive load in real time but also to dynamically modulate the interface based on it, controlling the induced cognitive load.
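A minimal sketch of the measurement idea, assuming a naive local-maximum beat detector on the camera brightness trace (a real pipeline would add bandpass filtering and artifact rejection); RMSSD is one standard PRV statistic:

```python
import numpy as np

def pulse_intervals(brightness, fps=30.0, min_gap=0.4):
    """Detect beats in a fingertip-brightness trace and return
    inter-beat intervals in seconds (naive local-maximum detector)."""
    b = brightness - brightness.mean()
    peaks = []
    for i in range(1, len(b) - 1):
        # local maximum above the mean, with a refractory period
        if b[i] > b[i - 1] and b[i] >= b[i + 1] and b[i] > 0:
            if not peaks or (i - peaks[-1]) / fps >= min_gap:
                peaks.append(i)
    return np.diff(peaks) / fps

def rmssd(intervals):
    """Root mean square of successive differences, a standard PRV measure."""
    d = np.diff(intervals)
    return float(np.sqrt(np.mean(d ** 2)))

# synthetic 10 s trace at 30 fps with a steady 1.2 Hz pulse (72 bpm)
t = np.arange(0, 10, 1 / 30)
signal = np.sin(2 * np.pi * 1.2 * t)
ibi = pulse_intervals(signal)
```

On this perfectly regular synthetic pulse the intervals are constant and RMSSD is zero; on real camera data, the variability of the intervals is the quantity of interest.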

You can download the thesis here









&#60;img width="1453" height="818" width_o="1453" height_o="818" src_o="https://cortex.persona.co/t/original/i/82e2e23291acf6f2a4dfcec39dd38dbaa53a5af692ddbef87ab09a2a3058830c/Screenshot-2024-11-04-103009.png" data-mid="1383308" border="0" /&#62;

Figure: The flash illuminates the skin placed over the camera, allowing detection of blood flow and its intervals.
Overview of the setup: after placing the index finger on the smartphone camera, the changes in blood flow to the fingertip, which produce different light reflections from the skin, are recorded in the video frames.

Originally from:

Bánhalmi, A. et al. Analysis of a pulse rate variability measurement using a smartphone camera. Journal of Healthcare Engineering, 2018.</description>
		
		<excerpt>Esen K. Tütüncü   With more and more intelligent systems surrounding us, many design decisions and system architectures that facilitate the flow of information...</excerpt>

		<!--<wfw:commentRss></wfw:commentRss>-->

	</item>
		
		
	<item>
		<title>Presence</title>
				
		<link>http://esenka.co/Presence</link>

		<comments></comments>

		<pubDate>Fri, 01 Nov 2024 05:40:10 +0000</pubDate>

		<dc:creator>EsenKa</dc:creator>
		
		<category><![CDATA[]]></category>

		<guid isPermaLink="false">454715</guid>

		<description>Esen K. Tütüncü

Sensorimotor contingencies (SC) refer to the rules by which we use our bodies to perceive. It has been argued that the more a virtual reality (VR) application affords natural SC, the greater the likelihood that participants will experience Place Illusion (PI), the illusion of ‘being there’ (a component of presence) in the virtual environment. However, notwithstanding numerous studies, this has only anecdotal support.

Here we used a reinforcement learning (RL) paradigm in which 26 participants experienced a VR scenario where the RL agent could sequentially propose changes to 5 binary factors: mono or stereo vision, 3 or 6 degrees of freedom head tracking, mono or spatialised sound, low or high display resolution, or one of two color schemes. The first 4 are SC, whereas the last is not. Participants could reject or accept each change proposed by the RL agent, until convergence.

Participants were more likely to accept changes from low to high SC than changes to the color. Additionally, theory suggests that increased PI should be associated with lower eye-scanpath entropy. Our results show that mean entropy did decrease over time, and the final level of entropy was negatively correlated with a post-exposure questionnaire-based assessment of PI.
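Scanpath entropy of the kind referred to above can be illustrated with Shannon entropy over fixated areas of interest. This is a simplified sketch with hypothetical fixation sequences, not the study’s actual analysis:

```python
import math
from collections import Counter

def scanpath_entropy(regions):
    """Shannon entropy (bits) of the distribution of gaze fixations
    over areas of interest; lower values mean more settled gaze."""
    counts = Counter(regions)
    n = len(regions)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# hypothetical fixation sequences over 4 areas of interest
early = ["A", "B", "C", "D", "A", "B", "C", "D"]   # gaze spread evenly
late  = ["A", "A", "A", "B", "A", "A", "A", "A"]   # gaze has settled
```

Here the evenly spread sequence yields the maximum 2 bits over 4 regions, while the settled sequence yields a lower value, mirroring the predicted decrease in entropy as PI develops.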


Download the preprint here



Figure - The string quartet scenario. (A) The learning phase where participants were in a room showing some loudspeakers and practiced changing the settings, in the case shown the resolution. (B) An overview of the scenario – this image has been slightly vertically stretched for alignment purposes. (C) The participant chooses whether or not to make a change to the audio. (D) The scene is shown in the alternate scheme and the participant chooses whether or not to accept this change.



&#60;img width="1480" height="1376" width_o="1480" height_o="1376" src_o="https://cortex.persona.co/t/original/i/0e4fb8d8ce7e33cd3fa3976c5659135bdedadb38bb90702b8653457055167ffa/Screenshot-2024-11-04-091628.png" data-mid="1383286" border="0" /&#62;</description>
		
		<excerpt>Esen K. Tütüncü Sensorimotor contingencies (SC) refer to the rules by which we use our body to perceive. It has been argued that to the extent that...</excerpt>

		<!--<wfw:commentRss></wfw:commentRss>-->

	</item>
		
		
	<item>
		<title>Small Groups</title>
				
		<link>http://esenka.co/Small-Groups</link>

		<comments></comments>

		<pubDate>Fri, 01 Nov 2024 05:39:57 +0000</pubDate>

		<dc:creator>EsenKa</dc:creator>
		
		<category><![CDATA[]]></category>

		<guid isPermaLink="false">454714</guid>

		<description>Esen K. Tütüncü

Exploring the dynamics of social interactions in virtual reality at a qualitative level adds to our understanding of how group meetings function. This study examined the influence of prior acquaintance on VR interactions. Groups of 3 or 4 participants, represented by realistic self-look-alike avatars, engaged in discussions on predefined themes. There were two conditions: one consisted of individuals with prior connections, and the other of people meeting for the first time in VR.

Questionnaire responses revealed that pre-existing acquaintance fostered a stronger sense of co-presence, associated with higher sentiment compared to first-time encounters. Social network analysis showed that groups with prior acquaintance had more efficient communication patterns. These insights are important for optimizing the design and dynamics of VR interactions, enhancing both social and professional virtual experiences. This research contributes to the development of VR environments that foster meaningful and engaging interactions, leveraging pre-existing social bonds to improve user experience.
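One simple way such communication efficiency can be quantified (a sketch, not necessarily the metric used in the study) is the global efficiency of the group’s interaction graph, where edges link participants who addressed each other:

```python
import itertools

def global_efficiency(nodes, edges):
    """Average inverse shortest-path length over all node pairs
    (1.0 means everyone talks to everyone directly)."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)

    def dist(src, dst):
        # breadth-first search distance; inf if disconnected
        seen, frontier, d = {src}, {src}, 0
        while frontier:
            if dst in frontier:
                return d
            frontier = {m for n in frontier for m in adj[n]} - seen
            seen |= frontier
            d += 1
        return float("inf")

    pairs = list(itertools.combinations(nodes, 2))
    return sum(1 / dist(a, b) for a, b in pairs) / len(pairs)

# hypothetical turn-taking graphs for a 4-person group
acquainted = [("A","B"),("A","C"),("A","D"),("B","C"),("B","D"),("C","D")]
strangers  = [("A","B"),("B","C"),("C","D")]  # conversation flows along a chain
```

A fully connected acquainted group scores 1.0, while a chain-like pattern among strangers scores lower, matching the more efficient communication observed with prior acquaintance.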



Download the abstract here
&#60;img width="1044" height="1037" width_o="1044" height_o="1037" src_o="https://cortex.persona.co/t/original/i/ffc79faf25b54679b67afc1255b21869b4ed659c72985d7af328a635da82327c/Screenshot-2024-11-04-092122.png" data-mid="1383292" border="0" /&#62;
Pipeline:
&#60;img width="2537" height="969" width_o="2537" height_o="969" src_o="https://cortex.persona.co/t/original/i/11d778f8f3841e250368e90b80bee96134fbc43e892961954a28afc345fbd96d/Screenshot-2024-11-04-092259.png" data-mid="1383293" border="0" /&#62;</description>
		
		<excerpt>Esen K. Tütüncü  Exploring the dynamics of social interactions in virtual reality at a qualitative level adds to our understanding of how group meetings...</excerpt>

		<!--<wfw:commentRss></wfw:commentRss>-->

	</item>
		
	</channel>
</rss>