carpe - computational audiovisual augmented reality research
http://archive.pkmital.com

C.A.R.P.E. version 0.1.1 release
https://archive.pkmital.com/2015/02/09/c-a-r-p-e-version-0-1-1-release/
Mon, 09 Feb 2015 20:01:12 +0000

[Screenshot: Screen Shot 2015-02-09 at 2.59.14 PM]

I’ve updated C.A.R.P.E., a graphical tool for visualizing eye-movements and processing audio/video. This release adds a graphical timeline (thanks to ofxTimeline by James George/YCAM), audio playback and scrubbing (using pkmAudioWaveform), audio saving, and various bug fixes. It also changes some parameters of the XML file and adds others, so please refer to the example XML file for how to set up your own data.

See my previous post for information on the initial release.

Please fill out the following form if you’d like to use C.A.R.P.E.:

Toolkit for Visualizing Eye-Movements and Processing Audio/Video
https://archive.pkmital.com/2015/02/06/toolkit-for-visualizing-eye-movements-and-processing-audio-video/
Fri, 06 Feb 2015 23:53:18 +0000

[Screenshot: Screen Shot 2015-02-06 at 6.24.27 PM]

Original video still without eye-movements and heatmap overlay copyright Dropping Knowledge Video Republic.

From 2008 to 2010, I worked on the Dynamic Images and Eye-Movements (D.I.E.M.) project, led by John Henderson, with Tim Smith and Robin Hill. Together we collected eye-movements from nearly 200 participants watching nearly 100 short films ranging from 30 seconds to 5 minutes in length. The database is freely available and covers a wide range of film styles, from advertisements to movie and music trailers to news clips. During my time on the project, I developed an open-source toolkit, C.A.R.P.E., or Computational Algorithmic Representation and Processing of Eye-movements (Tim’s idea!), to complement D.I.E.M. by visualizing and processing the data we collected. I used it while writing a journal paper describing a strong correlation between tightly clustered eye-movements and the motion in a scene. We also published visualizations of our entire corpus on our Vimeo channel. When the project came to a halt, so did the visualization software. I’ve since picked it back up and rewritten it entirely from the ground up.
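To give a flavor of the kind of analysis behind that finding, here is a rough sketch using synthetic data: per-frame gaze dispersion (mean distance of gaze points from their centroid) is correlated against a per-frame motion signal. The frame counts, viewer counts, and noise model are illustrative assumptions, not the actual D.I.E.M. pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, n_viewers = 200, 40

def gaze_dispersion(points):
    """Mean Euclidean distance of gaze points (N x 2) from their centroid."""
    centroid = points.mean(axis=0)
    return float(np.linalg.norm(points - centroid, axis=1).mean())

# Synthetic per-frame motion in [0, 1]; gaze spread shrinks as motion grows,
# mimicking the reported effect (viewers cluster on moving regions).
motion = rng.random(n_frames)
dispersion = np.array([
    gaze_dispersion(rng.normal(0.0, 100.0 * (1.1 - m), size=(n_viewers, 2)))
    for m in motion
])

# Tighter clustering with more motion shows up as a negative correlation.
r = np.corrcoef(motion, dispersion)[0, 1]
print(f"Pearson r (motion vs. gaze dispersion): {r:.2f}")
```

In the real analysis the motion signal would come from frame differencing or optical flow on the video, and the gaze points from the recorded eye-tracking data.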

The image below shows how you can represent the movie, the motion in the scene of the movie (represented in …

Dynamic Scene Perception Eye-Movement Data Videos and Analysis
https://archive.pkmital.com/2010/05/21/carpe-diem/
Fri, 21 May 2010 11:56:22 +0000

Over the past two years, I have been working under the direction of Prof. John M. Henderson, together with Dr. Tim J. Smith and Dr. Robin Hill, on the DIEM project (Dynamic Images and Eye-Movements). Our project has focused on investigating active visual cognition by eye-tracking numerous participants watching a wide variety of short videos.

We are in the process of making all of our data freely available for research use, and we have also worked on tools for analyzing eye-movements during such dynamic scenes.

CARPE, more bombastically known as Computational Algorithmic Representation and Processing of Eye-movements, lets you visualize eye-movement data alongside the video it was tracked on in a number of ways. It currently supports low-level feature visualizations, clustering of eye-movements, model selection, heat-map visualizations, blending, contour visualizations, peek-through visualizations, movie output, binocular data input, and more. The videos shown above on our Vimeo page were all created using this tool. Head over to Google Code to check out the source code or download the binary. We are still streamlining this process by creating manuals for new users and uploading more of the eye-tracking and video data so …
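The heat-map idea can be sketched in a few lines: accumulate gaze points into a 2D histogram and blur with a Gaussian to approximate foveal spread. This is a minimal sketch under that assumption; the function name, frame size, and sigma are illustrative, not CARPE's actual implementation or defaults.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaze_heatmap(gaze_xy, width=640, height=480, sigma=25.0):
    """gaze_xy: (N, 2) array of pixel coordinates -> normalized heat map."""
    heat = np.zeros((height, width))
    xs = np.clip(gaze_xy[:, 0].astype(int), 0, width - 1)
    ys = np.clip(gaze_xy[:, 1].astype(int), 0, height - 1)
    np.add.at(heat, (ys, xs), 1.0)       # accumulate fixation counts
    heat = gaussian_filter(heat, sigma)  # approximate foveal falloff
    return heat / heat.max() if heat.max() > 0 else heat

# Example: 40 viewers clustered near the frame center.
rng = np.random.default_rng(1)
points = rng.normal([320, 240], 30.0, size=(40, 2))
hm = gaze_heatmap(points)
print(hm.shape, float(hm.max()))  # (480, 640) 1.0
```

The normalized map can then be alpha-blended over the video frame, or thresholded to drive contour- or peek-through-style overlays of the kind listed above.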
