Computational Photography 15-862
Final Project: Music Synchronization and Chick Embryo Morphing
Yajuan Wang


Introduction:

The aim of this project is to offer a tour of chick embryos using techniques learned in the Computational Photography course. As the title suggests, the project is divided into two parts. First, I create a short movie introducing chick embryos, including a whole view of the embryo, a zoom-in on the heart, and certain (vitelline) vessels with flow visualized by fluorescent particles. The movie then focuses on the development of the aortic arches at three specific developmental stages; these arches develop into the aorta and pulmonary vessels in the adult. Morphing techniques are employed here to give a dynamic impression of this process. Second, a relatively simple piece of music is selected and synchronized with the movie created in the first step. This project serves as a summary of part of the work I conducted over the past year.

Sample images used in this project:

  • Whole picture of a chick embryo
  • Chick embryonic heart tube
  • Chick embryonic vessel with fluorescent particles
  • Aortic arch at Stage 18 (incubation time: 3 days)
  • Aortic arch at Stage 21 (incubation time: 3.5 days)
  • Aortic arch at Stage 24 (incubation time: 4 days)

Additional images that may be used in this project:

  • Aortic arch 3D model at Stage 18 (incubation time: 3 days)
  • Aortic arch 3D model with streamlines at Stage 24 (incubation time: 4 days)

Results:

Methods:

In the first part, the tasks are as follows (simplified code sketches of each step appear after the list):
  1. Extract the frames of each movie from the '.avi' files, and record each AVI file's information (for example, frames per second), which helps in making a whole movie with a constant heart beat. (Run 'extractIm.m' in the 'Initial Movies' folder; it saves the frames of the different movies into separate, correspondingly named folders: 'Heart', 'Embryos', 'vitelline', 'AA-18', 'AA-21', 'AA-24'.)
  2. From the file information obtained in (1), we know the frames differ in size and time spacing. Resize all frames to 350×350 resolution and select frames at a common time interval. (A small resize script, 'Imre.m', can be found in each folder mentioned in (1).)
  3. Use the code in the 'CodeForMorphing' folder to create the connecting transition images. The flow of the whole movie is:
    Embryo ---> heart ---> Embryo ---> vitelline vessels ---> Embryo ---> aortic arch AA-18 ---> aortic arch AA-21 ---> aortic arch AA-24
    Every transition marked by an arrow is created by the morphing code. A detailed explanation of the morphing code can be found in the 'CodeForMorphing' folder. After morphing, the selected frames used for the later movie are put into separate folders: 'Embryo2Heart', 'H2E', 'Embryos2Vitelline', 'V2E', 'E2A' (embryos to aortic arch), 'AA18-2-AA21', 'AA21-2-AA24'.
  4. Organize all the image frames into a new folder, 'NewAll'; the small script 'Allnew.m' then creates a new movie from the frames in that folder.
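
The following is a minimal sketch of step 1 under stated assumptions: it uses the modern VideoReader API (the original 'extractIm.m' likely used the older aviread/aviinfo functions), and the function name and frame-file naming are illustrative.

    % Extract every frame of one AVI into its own folder, and report
    % the frame rate needed to keep the heart beat constant later.
    function extractFrames(aviFile, outDir)
        v = VideoReader(aviFile);
        fprintf('%s: %.2f frames per second\n', aviFile, v.FrameRate);
        if ~exist(outDir, 'dir'), mkdir(outDir); end
        k = 0;
        while hasFrame(v)
            k = k + 1;
            imwrite(readFrame(v), fullfile(outDir, sprintf('frame_%04d.png', k)));
        end
    end

It would be called once per movie, e.g. extractFrames('heart.avi', 'Heart').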
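
Step 2 in the spirit of 'Imre.m' (again a sketch, not the actual script): resize every extracted frame to 350×350 and keep one frame per fixed time interval, using the frame rate reported in step 1.

    % Resize frames to 350x350 and keep one frame every dt seconds,
    % given the source frame rate fps from step 1.
    function resampleFrames(inDir, outDir, fps, dt)
        files = dir(fullfile(inDir, 'frame_*.png'));
        if ~exist(outDir, 'dir'), mkdir(outDir); end
        step = max(1, round(fps * dt));   % frames between two kept frames
        n = 0;
        for k = 1:step:numel(files)
            n = n + 1;
            im = imresize(imread(fullfile(inDir, files(k).name)), [350 350]);
            imwrite(im, fullfile(outDir, sprintf('frame_%04d.png', n)));
        end
    end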
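
The transitions in 'CodeForMorphing' warp the two endpoint images toward intermediate feature positions before blending; the sketch below shows only the cross-dissolve half of a morph, which is the part short enough to state here.

    % Generate nFrames transition frames between two equal-size images
    % by cross-dissolve; a full morph would also warp A and B toward
    % intermediate feature positions before blending them.
    function morphFrames(fileA, fileB, nFrames, outDir)
        A = im2double(imread(fileA));
        B = im2double(imread(fileB));
        if ~exist(outDir, 'dir'), mkdir(outDir); end
        for k = 1:nFrames                  % assumes nFrames >= 2
            t = (k - 1) / (nFrames - 1);   % blend weight, 0 to 1
            imwrite((1 - t) * A + t * B, ...
                    fullfile(outDir, sprintf('morph_%04d.png', k)));
        end
    end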
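
Step 4 ('Allnew.m') writes every frame in 'NewAll' into one movie; here is a sketch using VideoWriter (older MATLAB, and possibly the original script, would use movie2avi instead).

    % Assemble all frames in a folder into an AVI at the chosen frame
    % rate; playback order follows the zero-padded file names.
    function makeMovie(frameDir, outFile, fps)
        files = dir(fullfile(frameDir, '*.png'));
        vw = VideoWriter(outFile);
        vw.FrameRate = fps;
        open(vw);
        for k = 1:numel(files)
            writeVideo(vw, imread(fullfile(frameDir, files(k).name)));
        end
        close(vw);
    end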

In the second part, synchronizing the music and the movie, there are several challenges to overcome, for example:
  • how to read separate audio and video files and write them into one file
  • how to extract the heart beating frequency from the movie
  • how to extract the music beat from the audio
  • how to synchronize them
Therefore, the tasks are the following (rough sketches of each step appear after the list):
  1. After creating a movie and selecting a piece of music, we want to synchronize them (music: "06- Super Trouper"; movie: created in part one).
  2. Extract the music beat with 'AnalyzeWav.m'. Two of its outputs are important for later use: 'pulse', which records the number of beat pulses in the music over the duration of the movie created in part one, and 'time_begin', which records the time at which a given beat pulse appears in the audio. A detailed explanation can be found in the code.
    Reference: Mark R. Petersen, U. of Colorado Boulder Applied Math Dept, Feb 2004.
  3. Extract the heart beating frequency with 'movie_crop.m'. This code crops the part of the video where the heart is beating; using the histogram of that region, it counts the number of pixels above a certain intensity level in each frame and plots the waveform of this pixel count. Human participation is then needed to count the heart beats from the waveform. (Because the movie combines several different movies, the differing illumination levels make it difficult to set a single threshold; if a single movie is used, a threshold can be chosen from the waveform and the counting automated.)
  4. Compare the two frequencies obtained in (2) and (3), and change the music beat frequency to match the heart beating frequency using 'morphAudio.m'.
  5. At this point we have audio and video with matching beat rates. What remains is to make them start beating at the same time and to write a single file combining audio and video. 'playAudioVideo.m' reads the two files and adjusts the audio start time according to the parameter 'time_begin' from 'AnalyzeWav.m' (rerun on the new music from (4) to get the updated 'time_begin'). Here I want to thank everyone who shared the 'mmread' and 'mmwrite' code in the MathWorks File Exchange, which helped greatly in combining audio and video without resorting to Simulink. Thanks!
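
The sketch below is only a rough stand-in for what 'AnalyzeWav.m' (Petersen's code) computes; the envelope-threshold scheme and every parameter value here are assumptions, but it shows how quantities like 'pulse' and 'time_begin' can be obtained from an audio file.

    % Crude beat-pulse detector: threshold a smoothed energy envelope
    % of the audio and take rising edges as beat pulses.
    [y, fs] = audioread('06- Super Trouper.wav'); % wavread in older MATLAB
    y = mean(y, 2);                               % mix down to mono
    env = sqrt(movmean(y.^2, round(0.05 * fs)));  % 50 ms RMS envelope
    on = env > 1.5 * mean(env);                   % crude onset threshold
    onsets = find(on(2:end) & ~on(1:end-1)) / fs; % rising edges, in seconds
    pulse = numel(onsets);                        % beat pulses in the clip
    time_begin = onsets(1);                       % time of the first pulse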
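
A sketch of the 'movie_crop.m' idea follows; the region of interest and the intensity level are illustrative and, as noted above, the level has to be chosen per movie.

    % Count bright pixels over the heart region in every frame and plot
    % the count against time; the heart beats are then counted by eye
    % from the peaks of this waveform.
    v = VideoReader('NewAll.avi');
    rows = 100:200;  cols = 150:250;              % hypothetical heart ROI
    level = 180;                                  % intensity threshold
    counts = zeros(1, 0);
    while hasFrame(v)
        f = rgb2gray(readFrame(v));               % frames assumed RGB
        roi = f(rows, cols);
        counts(end+1) = sum(roi(:) > level);      %#ok<AGROW>
    end
    t = (0:numel(counts)-1) / v.FrameRate;
    plot(t, counts), xlabel('time (s)'), ylabel('pixels above level')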
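
'morphAudio.m' itself is not reproduced here; the sketch below shows the simplest way to scale the music's beat frequency to the heart's, by resampling (which also shifts the pitch; a phase-vocoder time stretch would preserve it). f_music and f_heart are the two frequencies from steps (2) and (3).

    % Shortening the signal by the factor f_music/f_heart raises its
    % tempo by f_heart/f_music, so the beats land at the heart rate.
    [p, q] = rat(f_music / f_heart);   % rational approximation of the ratio
    y2 = resample(y, p, q);            % y, fs as read in the beat step
    audiowrite('music_matched.wav', y2, fs);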
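
Finally, a sketch of the 'playAudioVideo.m' step built on the File Exchange mmread/mmwrite functions; the struct fields (.data, .rate), the argument order of mmwrite, and the trim-based alignment are all assumptions about those interfaces, so treat this as the shape of the step rather than its exact code.

    % Read the video and the tempo-adjusted audio, shift the audio so
    % its first beat pulse lines up with the video's first heart beat,
    % and write one combined file.
    video = mmread('NewAll.avi');                 % video struct
    [~, audio] = mmread('music_matched.wav');     % audio struct
    offset = round(time_begin * audio.rate);      % time_begin from the rerun of AnalyzeWav.m
    audio.data = audio.data(offset+1:end, :);     % start the audio at its first beat
    mmwrite('final_movie.avi', audio, video);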

Discussion:

There are some limitations to synchronizing music with a video combined from several different movies. For example, the process is not completely automatic: the heart-frequency step needs human participation, for the following reasons:

  • The different movies were collected under different illumination conditions, so no single threshold works well across all of them.
  • Individual embryos' heart beats differ slightly, which affects the synchronization result to a certain extent.
However, we also applied the method to a single movie (see the second movie in the Results part). Because this avoids the limitations above, the synchronization is better.

References:

  • http://marsyas.sness.net/
  • http://amath.colorado.edu/pub/matlab/music/
  • http://www.mathworks.com/matlabcentral/fileexchange/8028 (mmread)
  • http://www.mathworks.com/matlabcentral/fileexchange/15881 (mmwrite)
• Robert Harper and M. E. Jernigan, "Self-Adjusting Beat Detection and Prediction in Music", Vision and Image Processing Lab, Systems Design Engineering, University of Waterloo, Waterloo, Ontario, Canada.