
Audio Pre & Post Production in Films

Stages of pre-production: Stages of filmmaking


Filmmaking consists of five main stages:

• Development. The script is written and drafted into a workable blueprint for a
film.
• Pre-production. Preparations are made for the shoot, in which cast and crew are
hired, locations are selected, and sets are built.
• Production. The raw elements for the finished film are recorded.
• Post-production. The film is edited; music tracks (and songs) are composed,
performed and recorded; sound effects are designed and recorded; and any other
computer-graphic 'visual' effects are digitally added, and the film is fully
completed.
• Sales and distribution. The film is screened for potential buyers (distributors), is
picked up by a distributor, and reaches its theater and/or DVD audience.

Development

This is the stage where an idea is fleshed out into a viable script. The producer of the
movie will find a story, which may come from books, plays, other films, true stories,
original ideas, etc. Once the theme, or underlying message, has been identified, a
synopsis will be prepared. This is followed by a step outline, which breaks the story
down into one-paragraph scenes, concentrating on the dramatic structure. Next, a
treatment is prepared. This is a 25 to 30 page description of the story, its mood and
characters, with little dialog and stage direction, often containing drawings to help
visualize the key points.

The screenplay is then written over a period of several months, and may be rewritten
several times to improve the dramatization, clarity, structure, characters, dialogue, and
overall style. However, producers often skip the previous steps and develop submitted
screenplays which are assessed through a process called script coverage. A film
distributor should be contacted at an early stage to assess the likely market and potential
financial success of the film. Hollywood distributors will adopt a hard-headed business
approach and consider factors such as the film genre, the target audience, the historical
success of similar films, the actors who might appear in the film and the potential
directors of the film. All these factors imply a certain appeal of the film to a possible
audience and hence the number of "bums on seats" during the theatrical release. Not all
films make a profit from the theatrical release alone, therefore DVD sales and worldwide
distribution rights need to be taken into account.
The movie pitch is then prepared and presented to potential financiers. If the pitch is
successful and the movie is given the "green light", then financial backing is offered,
typically from a major film studio, film council or independent investors. A deal is
negotiated and contracts are signed.

Pre-production

In pre-production, the movie is designed and planned. The production company is created
and a production office established. The production is storyboarded and visualized with
the help of illustrators and concept artists. A production budget will also be drawn up to
cost the film.

The producer will hire a crew. The nature of the film, and the budget, determine the size
and type of crew used during filmmaking. Many Hollywood blockbusters employ a cast
and crew of thousands while a low-budget, independent film may be made by a skeleton
crew of eight or nine. Typical crew positions include:

• The director is primarily responsible for the acting in the movie and managing the
creative elements.
• The assistant director (AD) manages the shooting schedule and logistics of the
production, among other tasks.
• The casting director finds actors for the parts in the script. This normally requires
an audition by the actor. Lead actors are carefully chosen and are often based on
the actor's reputation or "star power."
• The location manager finds and manages the film locations. Most pictures are
shot in the predictable environment of a studio sound stage but occasionally
outdoor sequences will call for filming on location.
• The production manager manages the production budget and production schedule.
He or she also reports on behalf of the production office to the studio executives
or financiers of the film.
• The director of photography (DP or DOP) or cinematographer creates the
photography of the film. He or she cooperates with the director, director of
audiography (DOA) and AD.
• The art director manages the art department, which makes production sets,
costumes and provides makeup & hair styling services.
• The production designer creates the look and feel of the production sets and
props, working with the art director to create these elements.
• The storyboard artist creates visual images to help the director and production
designer communicate their ideas to the production team.
• The production sound mixer manages the audio experience during the production
stage of a film. He or she cooperates with the director, DOP, and AD.
• The sound designer creates new sounds and enhances the aural feel of the film
with the help of foley artists.
• The composer creates new music for the film.
• The choreographer creates and coordinates the movement and dance - typically
for musicals. Some films also credit a fight choreographer.
Production

In production the movie is created and shot. More crew will be recruited at this stage,
such as the property master, script supervisor, assistant directors, stills photographer,
picture editor, and sound editors. These are just the most common roles in filmmaking;
the production office will be free to create any unique blend of roles to suit a particular
film.

A typical day's shooting begins with an assistant director following the shooting schedule
for the day. The film set is constructed and the props made ready. The lighting is rigged
and the camera and sound recording equipment are set up. At the same time, the actors
are wardrobed in their costumes and attend the hair and make-up departments.

The actors rehearse their scripts and blocking with the director. The picture and sound
crews then rehearse with the actors. Finally, the action is shot in as many takes as the
director wishes.

Each take of a shot follows a slating procedure and is marked on a clapperboard, which
helps the editor keep track of the takes in post-production. The clapperboard has the
scene, take, director, director of photography, date, and name of the film written on the
front, and is displayed for the camera. The clapperboard also serves the necessary
function of providing a marker to sync up the film and the sound take: sound is recorded
on a separate apparatus from the film, and the two must be synched up in post-production.
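The clapperboard sync described above comes down to simple arithmetic: find the clap in the picture and in the audio, convert both to seconds, and trim the difference. A minimal sketch, with invented frame and sample numbers for illustration:

```python
# Hypothetical sketch: aligning separately recorded sound with picture
# using the clapperboard "clap" as a common reference point.
# The frame and sample numbers below are invented for illustration.

PICTURE_FPS = 24      # sound-film projection rate
AUDIO_RATE = 48000    # audio samples per second, a common production rate

def sync_offset_seconds(clap_frame, clap_sample):
    """Return how far (in seconds) the audio's clap lies ahead of the
    picture's clap, given where the clap appears in each recording."""
    clap_time_picture = clap_frame / PICTURE_FPS
    clap_time_audio = clap_sample / AUDIO_RATE
    return clap_time_audio - clap_time_picture

# Example: clap visible at frame 48 (2.0 s into the picture),
# clap audible at sample 120000 (2.5 s into the audio take).
offset = sync_offset_seconds(48, 120000)
print(offset)  # 0.5 -> trim 0.5 s from the head of the audio to line up
```

The same idea underlies modern automatic syncing; software simply locates the clap (or matches timecode) instead of a person doing it by eye and ear.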

The director will then decide if the take was acceptable or not. The script supervisor and
the sound and camera teams log the take on their respective report sheets. Every report
sheet records important technical notes on each take.

When shooting is finished for the scene, the director declares a "wrap." The crew will
"strike," or dismantle, the set for that scene. The director approves the next day's shooting
schedule and a daily progress report is sent to the production office. This includes the
report sheets from continuity, sound, and camera teams. Call sheets are distributed to the
cast and crew to tell them when and where to turn up the next shooting day.

For productions using traditional photographic film, the unprocessed negative of the day's
takes are sent to the film laboratory for processing overnight. Once processed, they return
from the laboratory as dailies or rushes (film positives) and are viewed in the evening by
the director, above the line crew, and, sometimes, the cast. For productions using digital
technologies, shots are downloaded and organized on a computer for display as dailies.

When the entire film is in the can, or in the completion of the production phase, the
production office normally arranges a wrap party to thank all the cast and crew for their
efforts.

Post-production

Here the film is assembled by the film editor. The modern use of video in the filmmaking
process has resulted in two workflow variants: one using entirely film, and the other
using a mixture of film and video.

In the film workflow, the original camera film (negative) is developed and copied to a
one-light workprint (positive) for editing with a mechanical editing machine. An edge
code is recorded onto film to locate the position of picture frames. Since the development
of non-linear editing systems such as Avid, Quantel or Final Cut Pro, the film workflow
is used by very few productions.

In the video workflow, the original camera negative is developed and telecined to video
for editing with computer editing software. A timecode is recorded onto video tape to
locate the position of picture frames. Production sound is also synced up to the video
picture frames during this process.

The first job of the film editor is to build a rough cut taken from sequences (or scenes)
based on individual "takes" (shots). The purpose of the rough cut is to select and order
the best shots. The next step is to create a fine cut by getting all the shots to flow
smoothly in a seamless story. Trimming, the process of shortening scenes by a few
minutes, seconds, or even frames, is done during this phase. After the fine cut has been
screened and approved by the director and producer, the picture is "locked," meaning no
further changes are made. Next, the editor creates a negative cut list (using edge code) or
an edit decision list (using timecode) either manually or automatically. These edit lists
identify the source and the picture frame of each shot in the fine cut.
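The edit lists described above are, at heart, small tables of frame references. The sketch below shows one hypothetical way to represent such a list; the field names and reel identifiers are invented for illustration, and real EDL formats (such as CMX 3600) carry considerably more information:

```python
# Hypothetical sketch of an edit decision list (EDL): each event records
# which source reel a shot comes from, which frames are used, and where
# the shot lands in the cut. Reel names and frame numbers are invented.
from dataclasses import dataclass

FPS = 24  # sound-film frame rate

@dataclass
class EdlEvent:
    reel: str      # source reel/tape identifier
    src_in: int    # first source frame used
    src_out: int   # one past the last source frame used
    rec_in: int    # frame where the shot starts in the cut

    @property
    def duration(self):
        return self.src_out - self.src_in

def to_timecode(frame, fps=FPS):
    """Render an absolute frame count as HH:MM:SS:FF timecode."""
    ff = frame % fps
    s = frame // fps
    return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{ff:02d}"

cut = [EdlEvent("A001", 100, 148, 0), EdlEvent("B002", 512, 560, 48)]
for ev in cut:
    print(ev.reel, to_timecode(ev.src_in), to_timecode(ev.src_out), ev.duration)
```

A negative cut list does the same job with edge-code footage counts instead of timecode, but the principle, identifying exact source frames for each shot, is identical.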

Once the picture is locked, the film passes out of the hands of the editor to the sound
department to build up the sound track. The voice recordings are synchronised and the
final sound mix is created. The sound mix combines sound effects, background sounds,
ADR, dialogue, walla, and music.
The sound track and picture are combined, resulting in a low-quality answer
print of the movie. There are now two possible workflows to create the high-quality
release print, depending on the recording medium:

1. In the film workflow, the cut list that describes the film-based answer print is used
to cut the original colour negative (OCN) and create a colour timed copy called
the colour master positive or interpositive print. For all subsequent steps this
effectively becomes the master copy. The next step is to create a one-light copy
called the colour duplicate negative or internegative. It is from this that many
copies of the final theatrical release print are made. Copying from the
internegative is much simpler than copying from the interpositive directly because
it is a one-light process; it also reduces wear-and-tear on the interpositive print.
2. In the video workflow, the edit decision list that describes the video-based answer
print is used to edit the original colour tape (OCT) and create a high quality
colour master tape. For all subsequent steps this effectively becomes the master
copy. The next step uses a film recorder to read the colour master tape and copy
each video frame directly to film to create the final theatrical release print.

Finally the film is previewed, normally by the target audience, and any feedback may
result in further shooting or edits to the film.

Distribution

This is the final stage, where the movie is released to cinemas or, occasionally, to DVD,
VCD or VHS (though VHS tapes are less common now that more people own DVD
players). The movie is duplicated as required for theatrical distribution. Press kits,
posters, and other advertising materials are published and the movie is advertised.

The movie will usually be launched with a launch party, press releases, interviews with
the press, showings of the film at a press preview, and/or at film festivals. It is also
common to create a website to accompany the movie. The movie will play at selected
cinemas and the DVD is typically released a few months later. The distribution rights for
the film and DVD are also usually sold for worldwide distribution. Any profits are
divided between the distributor and the production company.

1) Preparing a Sound Script:

a) Discussing the film script with the director: As soon as the audiographer receives
a copy of the film script, the first thing he or she should do is read the entire
script, whatever draft it may be (first draft or final draft). After going through
the script, the audiographer should sit with the director and discuss the entire
script shot by shot (if a shot breakdown is included) or scene by scene. In this
way the sound designer gets an idea of what will happen in the film sound-wise
and what the director has in mind for the sound design.
b) Making a note of all the sounds suggested by the script, scenes, and shots: After
discussing the entire script with the director, the audiographer/sound designer
should list all the sounds that come to mind, based on his or her perception of
the film's idea, noting which sounds will be recorded on location and which will
be added off-screen later. This process creates a rough outline of the basic sound
design, which will be elaborated in the post-production stage.

c) Breaking down the sounds into diegetic and non-diegetic (on-screen and off-screen)
in the sound script according to situation, scenes, and shots: Diegetic
sounds are those whose source is seen happening in the visual itself. For
example, if a school bus passes on screen, you will obviously hear the bus
passing, with children playing and making noise. If you then add some off-screen
sounds, such as a school bell ringing, the general ambience of a school, children
playing and fighting, or other distant vehicles passing, it makes a great deal of
difference: the sound seems to surround the audience, and even with the visual
switched off, we can still make out whatever action is happening. Sounds that
are heard but whose source is not seen on screen are called off-screen or
non-diegetic sounds. These sounds help in building a sound stage and thus
contribute a great deal to the overall sound design of the film.
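The breakdown in steps (a) through (c) can be kept as a simple per-scene structure. A minimal sketch, with invented scene and cue names for illustration:

```python
# Hypothetical per-scene sound-script breakdown, splitting each scene's
# cues into diegetic (source on screen) and non-diegetic (off-screen)
# lists as described above. Scene and cue names are invented.

sound_script = {
    "Scene 12 - School street, day": {
        "diegetic": ["school bus passing", "children on the bus"],
        "non_diegetic": ["school bell ringing", "playground ambience",
                         "distant traffic"],
    },
}

def all_cues(script, kind):
    """Collect every cue of one kind ('diegetic' or 'non_diegetic')
    across all scenes, e.g. to plan off-screen effects recording."""
    return [cue for scene in script.values() for cue in scene[kind]]

print(all_cues(sound_script, "non_diegetic"))
```

Keeping the two categories separate from the start makes it easy, in post-production, to see which sounds must come from the location recordings and which can be designed or pulled from a library later.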

2) Introduction to some FILM TERMS:

I. VOCABULARY RELATING TO THE SHOT

Take: The length of film exposed between each start and stop of the
camera. Thus, a shot that goes on for a long time without an edit is
called a "long take." During filming the same piece of action may
be filmed from the same camera setup several times (e.g., trying
for different emotions on the part of the actors); each time is called
a take.

Shot: A take, in part or in its entirety, that is used in the final edited
version of the film. In a finished film we refer to a piece of the
film between two edits as a shot. Whereas an edit can take the
story to a different time or a different place, the action within a
shot is spatially and temporally continuous. We can therefore
think of a shot as a "piece of time."
Shots are described by distance from the subject (ECU, CU, MCU,
MS, MLS, LS, ELS), by camera angle (low, high, eye-level), by
content (two-shot, three-shot, reaction shot, establishing shot), and
by any camera movement (pan, track, dolly, crane, tilt). The
average feature film contains between 400 and 1,000 shots.

Scale: The "bigness" of the subject in a given shot, determined by the
camera's distance from it.

Extreme Closeup (ECU): Closer shot than a closeup; a single detail
occupies most of the screen image, e.g., a mouth, a gun. Sometimes
called an "insert."

Close-Up (CU): The camera is close to the subject, so that when the
image is projected most of the screen shows a face and its
expression, or some relatively small part of a larger whole.

Medium Closeup (MCU): Shot whose scale is between MS and CU; a
character shown from the chest up.

Medium Shot (MS): A human subject in MS is generally shown from the
waist up; background begins to be visible and potentially
important, and two-shots are possible.

Medium Long Shot (MLS): Human subject is shown from the knees up.
Also called an American Shot because Hollywood movies of the
Thirties and Forties used it so often for dramatic action.

Long Shot (LS): The camera is a considerable distance from the
subject(s) being filmed. The whole human figure from head to feet
is included in the frame, with the surrounding environment very
visible.

Extreme Long Shot (ELS): The camera is very far away from the
subject, giving us a broad perspective. Often used to create an
"establishing shot," setting up a new scene.

Camera Angle: The position of the camera (in terms of height from the
ground) in relation to the subject being filmed.

Low-Angle Shot (LA): The camera is positioned below the subject, and
shoots upward at it. The effect is to make the subject look
dominating, powerful, as if a child were looking up at an adult.
An extreme low angle (ELA) is an extreme variant.

Eye-Level Shot: The camera is located at normal eye level (five to
six feet from ground level) in relation to the subject. Unless
otherwise noted in the script, the camera will automatically be set
up at eye level. When analyzing a scene, eye-level shots do not
need to be indicated as such; the reader will assume that this is
the position of the camera, unless otherwise indicated.

High-Angle Shot (HA): The camera is positioned somewhere above the
subject and shoots down at it. An extreme high angle is an extreme
variant. In a bird's-eye shot the camera is placed directly over
the subject.

Dutch or Oblique Angle Shot: The camera is tilted so that on screen
the horizon appears to be tilted. Often used as a subjective shot
to indicate stress, such as when a character is drunk or drugged.

Two-Shot: Medium or medium-long shot of two characters.

Three-Shot: Medium or medium-long shot of three characters.

Moving Shot: Produced when the camera moves. When the camera remains
fixed but swivels horizontally, it is called a pan; when it swivels
vertically, it is a tilt. When the camera itself travels
horizontally, it is a tracking shot. When the camera travels in
closer to a subject or away from a subject, it is called a dolly
shot. When the camera travels vertically, it is a crane or boom
shot.

Crane Shot: Shot taken from a crane or boom (a sort of huge
mechanical arm, which carries the camera and cameraman, and can
move in virtually any direction: vertically, forward-backward,
transversely, or in a combination of the above).

Tracking Shot: The camera is mounted on a dolly or truck, and moves
horizontally on wheels or railroad-like tracks to follow the
action being filmed or to survey the setting.

Dolly Shot: The camera is mounted on a dolly and moves toward
(dolly-in) or away from (dolly-out) the subject.

Hand-Held Shot: The camera operator carries the camera while filming
the action; this has become possible over the last thirty years
with the invention of lighter cameras. Can be used with a
"Steadicam" system, a hydraulic harness device that keeps the
movement very smooth, almost as smooth as a dolly or crane shot.
Usually, however, hand-held shots are used for their lack of
smoothness, to give the impression of the point of view of a
person walking, for greater naturalism or to create suspense.

Zoom Shot: Technically not a moving shot, because the camera itself
does not move; the zoom is made by the zoom lens, which has
variable focal length (see Section IV). The zoom became a popular
technique in the Sixties. On screen a zoom-in resembles a
dolly-in, but its flattened telephoto look as it moves in on the
subject differs from the more realistic, dynamic look of a dolly
or hand-held shot.

Pan Shot: The view sweeps from left to right or from right to left.
Differs from the tracking shot in that the camera is not mounted
on a movable object but stays fixed, panning on a horizontal axis
(short for "panorama"). In a flash pan or zip pan the movement is
so rapid that the filmed action appears on screen as only a
blurred movement.

Tilt: Like a pan, but the camera tilts up or down along a vertical
plane.

Stock Shot: A shot "borrowed" from the archives of a studio.
Generally, this is a shot made for another film, frequently a
documentary, e.g., the New York skyline, the White House, a WWII
naval battle scene.

Subjective Shot: The camera is positioned at an angle, or has
something about its content (distortion through misfocus or
strange color, etc.), to suggest that the shot is seen from the
viewpoint of a particular character in the film, usually a
character in an abnormal frame of mind (e.g., through drunkenness,
fear, or heightened sensitivity).

Long Take: A shot that lasts a long time (as distinguished from a
long shot, where "long" refers to camera distance).

Mise-En-Scene: A theoretical term coming from the French, meaning,
more or less, "staging." In general, concerns everything within a
shot as opposed to the editing of shots; includes camera movement,
set design, props, direction of the actors, composition of formal
elements within the frame, lighting, and so on. In film theory
mise-en-scene is one of the two major categories of film analysis;
"montage" (editing) is the other.

II. VOCABULARY RELATING TO EDITING AND SEQUENCE CONSTRUCTION

Cross-Cutting: Cutting back and forth between shots from two (or more)
scenes or locales. This alternation suggests that both actions are
occurring simultaneously.

Cut: The most immediate, and common, of transitions from shot to
shot. It is effected in the laboratory simply by splicing one shot
onto another. On screen the appearance of the second shot
immediately replaces the first. "To cut" also means to edit; in
addition, during filming "to cut" means to stop the camera.

Editing: The joining together of shots to make a sequence or a film.
This also includes the process of matching the soundtrack and the
visuals. The European word for editing is montage, which has
become the critical term for editing.

Establishing Shot: Also called a master shot. A long shot, usually at
the beginning of a scene, that establishes the spatial
relationships of the characters, actions, and spaces depicted in
subsequent closer shots.

Insert: A shot of a static object, such as a book, letter, clock,
murder weapon, or pile of cash, inserted during the editing
process, generally between two shots of a character looking
offscreen, usually to indicate that this is what s/he is looking
at.

Jump Cut: A break or cut in a shot's temporal continuity, caused by
removing a section of a shot and then splicing together what
remains of it. On screen the result is abrupt and jerky; in
certain films it is deliberate. Also, a jump cut is a transition
indicating a break in temporal continuity between two adjacent
shots. For example, a shot of a character opening a car door is
followed by a shot of him driving the car; we don't see the
character actually getting into the car, starting the motor, and
beginning to drive. The term is also used to indicate an abrupt
and unexpected shift in locale.

Match Cut: A transition that involves a direct cut from one shot to
another which "matches" it in action, subject matter, or actual
composition. This kind of transition is commonly used to follow a
character as s/he moves or appears to move continuously. Film
continuity is often dependent on match cutting. Match cutting can
also be used in a montage sequence, to show a similar activity
occurring over a passage of time.

Montage: (1) Editing; putting together shots and creating a "film
reality." (2) A short, impressionistic sequence used to show
either the passage of time or an accumulation of objects or events
used descriptively. (3) In critical terms, montage is often
opposed to mise-en-scene, to refer to the creation of a film
reality through piecing together fragments of reality (or shots).
Montage is all that happens between shots. A filmmaker who
stresses this tendency (i.e., using much editing) has a montage
style; a filmmaker who tends not to cut, favoring long takes, lots
of camera movement, etc., is considered a mise-en-scene director.

Parallel Editing: Same as cross-cutting.

Reaction Shot: A shot showing the reaction of a character to
something or someone seen in the previous shot.

Reverse Angle Shot: In filming conversations, an alternation or
cross-cutting of shots filmed from an over-the-shoulder position
of each character in turn is reverse angle shooting. Each shot
shows the face of one character and the back of the head and
shoulders of the other.

Scene: A portion of the film in which all of the action occurs in the
same place and in the same time span. A scene may be composed of
any number of shots.

Sequence: Any section of a film that is self-contained enough to be
intelligible when viewed apart from the rest of the film. Unlike a
scene, it can consist of action occurring in various places and at
different times.

Splice: The physical point at which two shots are joined by glue or
tape during editing. A machine called a splicer aids in creating a
splice.

III. TRANSITIONS

Burn In: Gradually going from a white screen to an image.

Burn Out: Gradually going from an image to a white screen.

Cut: The most immediate, and common, of transitions from shot to
shot. On screen the appearance of the second shot immediately
replaces the first. The cut is increasingly being used as a
transition between sequences as well (traditionally the fade and
the dissolve have been used for this purpose).

Dissolve: The end of one shot merges slowly into the beginning of
the next; as the second shot becomes increasingly distinct, the
first slowly disappears. Traditional way of moving from sequence
to sequence.

Fade-In: Slow brightening of the picture from a black screen to
normal brightness. Suggests the passage of time.

Fade-Out: Reverse of the fade-in. The shot gradually darkens to
blackness, usually signalling the end of a sequence.

Iris-In: A shot that opens from darkness in an expanding circle of
image, as if a circular window were opening on the image.
Frequently used in the silent cinema.

Iris-Out: The opposite of an iris-in, ending a shot with a
progressively narrower iris.

Jump Cut: See Section II above.

Match Cut: See Section II above.

Wipe: Transition from one shot to the next, in which the second
appears and wipes or pushes off the first, much like a windshield
wiper.

IV. VOCABULARY RELATING TO PHOTOGRAPHIC AND TECHNICAL PROPERTIES OF FILM

Aspect Ratio: The proportions of the frame; the ratio of the width to
the height of the image area. The traditional aspect ratio for
35 mm film is 1.33:1 and is known as Academy Aperture. For
wide-screen processes such as CinemaScope, the aspect ratio may
range from 1.65:1 to 2.55:1. All film gauges are wider than they
are high, a factor affecting formal composition within the frame.
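Since an aspect ratio is just width divided by height, the figures quoted above can be checked with a line or two of arithmetic. A small sketch (the pixel dimensions below are invented for illustration):

```python
# Check the aspect-ratio figures above: ratio = width / height.
# The example frame dimensions are invented for illustration.

def aspect_ratio(width, height):
    return width / height

def is_widescreen(width, height):
    """True if the frame falls in the quoted wide-screen range
    of 1.65:1 to 2.55:1."""
    return 1.65 <= aspect_ratio(width, height) <= 2.55

# Academy Aperture: a 4:3 image area gives the quoted 1.33:1.
print(round(aspect_ratio(4, 3), 2))    # 1.33

# A hypothetical 2048 x 858 scope frame is about 2.39:1 -> widescreen.
print(is_widescreen(2048, 858))        # True
```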

Depth of Field: The degree to which an image is in sharp focus in
depth (usually a function of the size of the camera lens opening).
In shallow focus (shallow depth of field), a very narrow zone of
depth is in focus at any one time (foreground or midground or
background), and everything closer to and further from the camera
is out of focus. In deep focus all distance planes (foreground,
midground, and background) remain clearly in focus, from close-up
range to infinity.

Film Stock: The "raw," unexposed film that is loaded into the camera
for shooting. Film stock can be color or black-and-white, "fast"
or "slow."

Focus: The degree of sharpness and clarity in a film image. "Out of
focus": the images are blurred and lack linear definition.

Footage: Exposed film stock.

Frame: An individual image on a strip of film. In silent films
frames were projected at the rate of 16 frames per second;
in sound film they are projected at the rate of 24 frames per
second.
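The frame rates above translate directly into running time: duration in seconds is simply the frame count divided by the projection rate, which is why silent footage runs longer when projected at its native speed. A small arithmetic sketch:

```python
# Convert frame counts into running time at the two projection rates
# quoted above: duration = frames / frames-per-second.

SILENT_FPS = 16   # silent-era projection rate
SOUND_FPS = 24    # sound-film projection rate

def running_time_seconds(frames, fps):
    return frames / fps

# The same 2,400-frame strip of film runs longer at the silent rate:
print(running_time_seconds(2400, SOUND_FPS))   # 100.0 seconds
print(running_time_seconds(2400, SILENT_FPS))  # 150.0 seconds
```

This is also why silent films shown on sound projectors at 24 fps look sped up: each second of projection consumes more frames than were shot per second.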

Lenses of the Movie Camera:

Wide-Angle: A lens with a short focal length, having a wider than
normal field of view. Has the effect of appearing to expand the
depth of the image, and can cause visual distortion when the
subject is close to the camera.

Normal: A lens with a focal length that gives approximately the
field of view of normal human vision, without the expanded depth
of the wide-angle lens or the compression of the telephoto.

Telephoto: A lens with a long focal length, which gives a narrower
than normal field of view and compresses depth in space, appearing
to bring distant subjects nearer and giving the image a flattened
effect (the opposite of wide-angle photography).

Zoom: A composite lens that allows one to move from wide-angle to
normal to telephoto or the reverse. Makes it possible to move
toward or away from the subject without moving the camera.

Optical Printer: An elaborate mechanical device used to create
special effects in a film print, such as fade-ins and fade-outs,
dissolves, superimpositions, and other effects. Much of this work
is now done through computer-graphics technology.

Overexposure: A shot brighter and more contrasty than normal,
resulting from too much light having entered the lens and reached
the film.

Rack Focus: When the zone of sharp focus changes from foreground to
background (or vice versa) within a single shot. The viewer's
attention is thus drawn from one plane to another.

Soft Focus: The image is softened by diffusing the light and
reducing the sharpness of the lens.

Superimposition: When two different shots are printed onto the same
strip of film. Every dissolve contains a brief superimposition.

V. MISCELLANEOUS TERMS

Auteur: French for "author." The auteur theory was popularized by


the New Wave French critics of the Fifties and Sixties, and
emphasized the director as major creator of film art. A
strong director--an auteur--stamps her/his film with a
personal vision, often in spite of external impositions such
as producers and studio pressures.

Exterior: A scene apparently shot out of doors. The exterior may be


simulated in the studio or it may be filmed "on location."
Eyeline: The direction in which a character is looking. Eyelines are
often a way of letting us know what (or whom) a character
is interested in.

Genre: A recognizable type of film which depends on certain established conventions and expectations. Common American genres are the Western, the Gangster Film, the Horror Film, the Musical, the Detective Film, and so on. Examples of generic conventions would be the Gangster Film's urban setting, fast cars, drinking, moll, pistols, machine guns, and flashy clothes.

Interior: An indoor scene, filmed either on a studio set or on location.

Intertitles: Frames with written text, coming between image shots, used in silent films to transmit necessary verbal information, such as explanations or dialogue.

Lighting: The illumination of the set. Lighting may be described in terms of the direction from which the light enters the set (front-lighting, back-lighting, side-lighting, top-lighting, cross-lighting). Lighting may also be described in terms of the contrast between light and dark: High-key lighting (the main or key lights produce a diffuse illumination, with few shadows created); Chiaroscuro or low-key lighting (very contrasty, with some parts of the set highlighted and the rest in darkness; lots of shadows). Highlighting can also be a means of emphasizing a character's hair or eyes.

Voice-Over: The voice of a narrator is heard, although the character speaking is usually not presented visually. If the character is visually present, there is no lip movement, a convention indicating that we are hearing the character's thoughts.

3) Preparations for SYNC SOUND :

a) Introduction to SYNC SOUND.

Sync sound (synchronized sound recording) refers to sound recorded at the time of the
filming of movies, and has been widely used in U.S. movies since the birth of sound
movies. The first animated film in which sync sound was used is Walt Disney's
Steamboat Willie. The characters and the boat dance in time with the music, and the gags
are sound-related.

In Hong Kong, sync sound was not widely used until the 1990s, as the generally noisy
environment and lower production budgets made such a method impractical.

Most Bollywood films from the 1960s onwards did not employ this technique, and for that very reason the recent film Lagaan was noted for its use of sync sound. The common practice in Bollywood is to 'dub' over the dialogue at the post-production stage. Although the very first Indian talkie, Alam Ara, released in 1931, saw the very first use of sync sound in India, and Indian films were regularly shot in sync sound until the 1960s with the silent Mitchell camera, with the arrival of the Arri 2C, a noisy but more practical camera, particularly for outdoor shoots, 'dubbing' became the norm and the trend was never reversed. [1]

Use of this technique has increased in recent times with developments in film techniques and instrumentation like the blimped Arri camera. This system was used in the Hindi movie Jodhaa Akbar (2008).

b) Going on location: The very first step in sync sound for a sound designer is to go on location with the director to find out how noisy or quiet the location is. Depending upon this, he/she takes the decision of whether to shoot at that location at all, or which particular sequences to dub.
c) Choosing microphones/recorder: Depending upon the number of characters, the location, dialogue overlapping, and the post-production format (i.e. mono, stereo or surround), the sound designer is supposed to pick all the required equipment, such as a single-track or multitrack location recorder like a NAGRA or PD-6. If the sound engineer is only recording the pilot (reference track), then a single-track recorder will do; otherwise he will use a multitrack recorder with shotgun mics and lapels for dialogue and room-tone pickup, and dedicated stereo microphones for ambience and other effects recording. Also, depending upon the type of location, the sound engineer is supposed to carry a sufficient quantity of pro-quality cables, boom rods, windshields, etc.

d) Introduction to the different kinds of multitrack location recorders available, like the Fostex PD-6, CANTAR, DEVA, NAGRA HDD recorder, etc.

4) Making a Sound continuity sheet :

a) Introduction: A sound continuity sheet comes in different formats depending upon the requirements of the recordist. It consists of various parameters like scene, shot, take, NG (not good) or OK, description of the shot and sound, etc. It is highly recommended to maintain a sound continuity sheet on an everyday basis while shooting the film, because it helps a lot in maintaining a log of each and every sound with its corresponding shot, which is useful at every stage of sound production. A simple sound continuity sheet looks like this:

S.no. | Scene | Shot | Take | NG or OK | Description of shot            | Description of sound                           | Remarks
1.    | 1A    | 1    | 1    | NG       | Close-up of old man            | Old man crying                                 |
2.    | 1A    | 1    | 2    | OK       | Close-up of old man            | Old man crying                                 | Can be used in track laying
3.    | 1B    | 1    | 1    | OK       | Old man getting up and walking | Footsteps and walking stick, with man coughing |
4.-10.|       |      |      |          |                                |                                                |
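Kept digitally, the same log becomes easy to filter during post-production. A minimal Python sketch (the field names mirror the sheet's columns; the entries are the sample rows above):

```python
from dataclasses import dataclass

@dataclass
class ContinuityEntry:
    """One row of a sound continuity sheet."""
    scene: str
    shot: int
    take: int
    ok: bool               # True = OK, False = NG (not good)
    shot_desc: str
    sound_desc: str
    remarks: str = ""

# Entries from the sample sheet above
log = [
    ContinuityEntry("1A", 1, 1, False, "Close-up of old man", "Old man crying"),
    ContinuityEntry("1A", 1, 2, True,  "Close-up of old man", "Old man crying",
                    "Can be used in track laying"),
    ContinuityEntry("1B", 1, 1, True,  "Old man getting up and walking",
                    "Footsteps and walking stick, man coughing"),
]

# Pull out only the usable (OK) takes for track laying
ok_takes = [e for e in log if e.ok]
for e in ok_takes:
    print(f"Scene {e.scene} Shot {e.shot} Take {e.take}: {e.sound_desc}")
```

This kind of filtering is exactly why the sheet pays off at every stage: the dialogue editor can instantly list every OK take for a given scene instead of auditioning all of them.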

5) Backing up your recorded data :

After the audiographer has recorded all the sounds in his recorder, he is supposed to take a backup of all the data for post-production, either on a hard disk, DVD-RAM, CD or flash drive, whichever medium is easily available and efficient. These days digital hard-disk location recorders like the CANTAR, DEVA and PD-6 provide internal hard disks along with DVD-RAM and flash-card support, so one can simply copy the data onto a flash card or DVD-RAM and straightaway transfer it to a DAW for track laying and post-production. Here are some pictures of location recorders available these days.

Top: DEVA 5.8 Bottom: CANTAR X

Transferring data: Depending upon which DAW (Pro Tools or Nuendo) you are using, sound is first transferred to the internal HDD of the machine you are working on. It is highly recommended to keep the audio data and the video data in separate partitions, to avoid confusion and maintain efficiency as well. After copying all the data to the hard drive, an audio session is created in the DAW with various attributes, like sample rate, bit depth, time code and frame rate, set so as to lock the audio with the picture. If you have got metadata (information embedded inside the audio files, like scene number, shot number, sample rate, bit depth, number of channels, mono or stereo, etc.), then the files can be imported according to that as well. After all the audio data has been imported and a session is made, all the tracks are organized as Dialogues, Ambiences and FX, to avoid confusion during further sound editing, cleaning, etc. Video data, which is a telecine version of the actual shot footage, is also imported along with the pilot audio to sync all the tracks; we will discuss telecine and reverse telecine later.
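The Dialogues/Ambiences/FX grouping described above can be sketched as a small script. The filename prefixes and names here are assumptions for illustration, not a fixed industry convention:

```python
import os

# Hypothetical naming convention: the recordist prefixes each file
# with DIA (dialogue), AMB (ambience) or FX (effects).
GROUPS = {"DIA": "Dialogues", "AMB": "Ambiences", "FX": "FX"}

def organize(files):
    """Group recorded files into DAW track folders by filename prefix."""
    layout = {name: [] for name in GROUPS.values()}
    layout["Misc"] = []          # anything untagged is sorted by hand later
    for f in files:
        prefix = os.path.basename(f).split("_", 1)[0].upper()
        layout[GROUPS.get(prefix, "Misc")].append(f)
    return layout

session = organize([
    "DIA_sc1A_sh1_tk2.wav",      # usable dialogue take
    "AMB_market_morning.wav",    # stereo ambience
    "FX_walking_stick.wav",      # spot effect
    "slate_tone.wav",            # untagged -> Misc
])
print(session["Dialogues"])      # ['DIA_sc1A_sh1_tk2.wav']
```

Organizing by prefix at import time means the session already reflects the Dialogues/Ambiences/FX track layout before any editing or cleaning begins.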
6) Learning about film formats :

Introduction to PAL and NTSC: In the 1950s, when the Western European countries were
planning to establish colour television, they were faced with the problem that the already
existing American NTSC standard wouldn't fit the 50 Hz AC frequency of the European
power grids. In addition, NTSC demonstrated several weaknesses, including colour
tone shifting under less-than-ideal transmission conditions. For these reasons the
development of the SECAM and PAL standards began. The goal was to provide a colour
TV standard with a picture frequency of 50 fields per second (50 hertz), and sporting a
better colour picture than NTSC.

PAL was developed by Walter Bruch at Telefunken in Germany. The format was first
unveiled in 1963, with the first broadcasts beginning in the United Kingdom and
Germany in 1967.[1]

Telefunken was later bought by the French electronics manufacturer Thomson. Thomson
also bought the Compagnie Générale de Télévision where Henri de France developed
SECAM, historically the first European colour television standard. Thomson nowadays
also co-owns the RCA brand for consumer electronics products, which created the NTSC
colour TV standard before Thomson became involved.

The term "PAL" is often used informally to refer to a 625-line/50 Hz (576i, principally
European) television system, and to differentiate from a 525-line/60 Hz (480i, principally
North American/Central American/Japanese) "NTSC" system. Accordingly, DVDs are
labelled as either "PAL" or "NTSC" (referring informally to the line count and frame
rate) even though technically the European discs do not have PAL composite colour. This
usage may lead readers to believe that PAL defines image resolution, even though it
doesn't. The PAL colour system can be used in conjunction with any resolution and frame
rate, and various such combinations exist. NTSC, by contrast, does define the video line
and frame format.
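The numbers behind these standards can be worked out directly. A small sketch; the ~4% PAL film speed-up at the end is a commonly cited consequence of running 24 fps film at 25 fps, not something stated in the text above:

```python
# Fields are interlaced, so frame rate = field rate / 2.
pal_fps  = 50 / 2            # 25.0  (625-line/50 Hz systems)
ntsc_fps = 60 / 1.001 / 2    # ~29.97 (525-line/59.94 Hz systems)

# Film runs at 24 fps. For PAL television it is simply played at
# 25 fps, so picture and sound run faster by:
speedup = 25 / 24 - 1        # ~0.0417, i.e. about 4.2%

print(round(ntsc_fps, 3))       # 29.97
print(round(speedup * 100, 1))  # 4.2
```

That speed-up is one reason frame rate must be set correctly when a sync-sound session is created in the DAW: audio conformed to the wrong rate drifts out of sync and shifts in pitch.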

a) Introduction to Telecine and Reverse Telecine:

Telecine: Telecine (from the same root as in 'cinema'; also "tele-seen") is the process of transferring motion picture film into electronic form. The term is also used to refer to the equipment used in the process.

Telecine enables a motion picture, captured originally on film, to be viewed with standard video equipment, such as televisions, video cassette decks or computers. This allows producers and distributors working in film to release their products on video and allows producers to use video production equipment to complete their film projects. The word "Telecine" is a combination of "television" and "cinema." Within the film industry, it is also referred to as a TK, as TC is already used to designate time code.

Reverse telecine: Reverse telecine, in simple words, is the process of converting the electronic form of a motion picture back into celluloid form, i.e. transferring the electronic form to physical film.
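For NTSC, the telecine transfer from 24 fps film to ~30 fps video is commonly done with "2:3 pulldown": alternate film frames are held for 2 and then 3 interlaced fields, so every 4 film frames become 10 fields (5 video frames). A minimal sketch (frame labels are illustrative):

```python
def pulldown_2_3(film_frames):
    """Expand 24 fps film frames into interlaced video fields (2:3 pulldown).

    Alternate frames are held for 2 fields, then 3 fields, so every
    4 film frames yield 10 fields = 5 video frames (24 -> 30 fps).
    """
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

fields = pulldown_2_3(["A", "B", "C", "D"])
print(fields)        # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
print(len(fields))   # 10 fields -> 5 video frames
```

Undoing this mapping (removing the repeated fields to recover the original 24 film frames) is the pulldown-removal step performed when video work has to be conformed back to the film original.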

7) Preparations for dubbing (ADR)

Introduction to dubbing: Automated dialogue replacement or additional dialogue recording (ADR) is a film sound technique involving the re-recording of dialogue after photography, also known as "looping" or a looping session.[1] In the UK it is called post-synchronization or post-sync.

In conventional film production, a production sound mixer records dialogue during photography, but several uncontrollable issues during principal photography, such as traffic or animal noise, can cause the production sound to be unusable. This is also true for computer-generated imagery, since some of the "actors" were not actually present on the set.

When the film is in post-production, a Supervising Sound Editor or ADR Supervisor reviews all of the dialogue in the film and decides which actor lines will have to be replaced using the ADR technique.

ADR is recorded during an ADR session. An actor, usually the original actor on set, is
called to a sound studio equipped with video playback equipment and sound playback
and recording equipment. The actor wears headphones and is shown the film of the line
that must be replaced, and often he or she will be played the production sound recording.
The film is then projected several times, and the actor attempts to re-perform the line
while watching the image on the screen, while an ADR Recordist records the
performances. Several takes are made, and based on the quality of the performance and
sync, one is selected and edited by an ADR Editor for use in the film.

Sometimes, a different actor is used from the actual actor on set. One famous example is
the Star Wars character Darth Vader, portrayed by David Prowse. In postproduction, his
voice was replaced with that of James Earl Jones. Many fans therefore think of Jones as
"the man who played Darth Vader" rather than Prowse.

There are variations of the ADR process. ADR does not have to be recorded in a studio,
but can be recorded on location, with mobile equipment; this process was pioneered by
Matthew Wood of Skywalker Sound for Star Wars Episode I: The Phantom Menace.
ADR can also be recorded without showing the actor the image they must match, but
only by having him listen to the performance. This process was used for years at
Universal Studios.

An alternative method, called "rythmo band" (or "lip-sync band"), was historically used in Canada and France. This band provides a more precise guide for the actors, directors and technicians, and can be used to complement the traditional headphone method. The band is actually a clear 35 mm film leader on which the dialogue is written in India ink, along with numerous indications for the actor (laughs, cries, length of syllables, mouth sounds, mouth openings and closings, etc.). The lip-sync band is projected in the studio and scrolls in perfect synchronization with the picture. Thanks to the high efficiency of the lip-sync band, the number of retakes is reduced, resulting in a substantial saving in recording time (as much as 50% compared to headphones-only recording).

Historically, the preparation of the lip-sync band is a long, tedious and complex process
involving a series of specialists in an old fashioned manual production line. Until
recently, such constraints have prevented this technique from being adopted
internationally, particularly in the United States.

Advanced software technology has been able to digitally reproduce the rythmo-band output in a fraction of the time. This technology is being adopted in all markets and is proven to reduce the amount of studio time and the number of takes required for actors to achieve accurate synchronization.

Using the traditional ADR technique (headphones and video), actors can average 10-12 lines per hour. Using the newer digital rythmo-band technologies, actors can output 35-50 lines per hour, and much more with experience. Studio output with multiple actors can therefore reach 200-400 lines per hour. dubStudio pioneered the digital rythmo-band technology, which has been used by several large dubbing studios.

ADR can also be used to redub singing. This technique was used by, among many others, Billy Boyd and Viggo Mortensen in The Lord of the Rings.

Adding or replacing non-vocal sounds, such as sound effects, is the task of a foley artist.

8) Foley recording :

Intro to foley: Foley is a term used to describe the process of creating sound effects to enhance the soundtrack of a film, video or multimedia work. The term is also used to describe a place, such as a foley stage or foley studio, where the foley process takes place. "Foley" gets its name from Jack Donovan Foley, a sound editor at Universal Studios.
A foley studio usually consists of one big theatre with a desk inside, or at least two separate rooms, where one serves as a control booth and recording room and the other is used as a foley stage where the sound effects are actually created. The separation of foley stage and recording room is very desirable, as some foley effects, such as smashing watermelons, can become very messy; however, having a single room works far more effectively for communication, as this allows the editor, mixer, and technician to be more involved in the creation process, such as mic positioning and prop and surface preparation.

The need for replacing or enhancing sounds in a film production arises from the fact that, very often, the original sounds captured during shooting are obstructed by noise or are not convincing enough to underscore and supplement the visual effect or action. For example, fist-fighting scenes in an action movie are usually staged by stunt actors, and therefore such scenes do not have any of the sounds that are normally associated with fist fighting. It is therefore necessary to enhance such scenes with artificially created sounds that mimic fist fighting.

People who work in a foley studio are sometimes called foley artists or foley technicians.

Foley recording follows the same process as dubbing, but this time live FX are recorded while watching the visuals, taking care of sync and the other factors explained above.

9) Music recording and production for film :

Music production for a film is done by the music producer or music director, who is responsible for the full musical soundtrack of the film. There can be a single person handling all the music, or many composers working on a single film. The music composer is the person responsible for creating melodies; the music arranger/programmer handles all the musical arrangements, like keyboards, percussion and other instruments, and puts them together to create a backing for the main melody; the lyricist writes the lyrics for the songs; and the music recordist/mixing engineer does all the recording, editing and mixing of the music tracks, whether it's a song or a background score. Background scoring is different from songs, as it only contains a musical arrangement without any lyrics. As soon as the main theme or composition is composed, the music composer gives a dummy or scratch composition to his arranger, and the arranger creates all the musical parts to build the complete song. These days musical arrangements are done mostly using MIDI music technology. When the arrangements are ready, the music director calls all the musicians to a studio; if they are not all available on the same day or at the same time, the music is recorded on a multitrack system part by part. After all the musical score and songs are recorded, the music engineer sits with each and every track and edits and cleans it to suit the film. Afterwards the music is mixed and mastered, either in stereo for audio CDs or in 5.1 surround for theatrical release.
10) Sound design :

What is Sound Design?


You may assume that it’s about fabricating neat sound effects. But that doesn’t describe
very accurately what Ben Burtt and Walter Murch, who invented the term, did on "Star
Wars" and "Apocalypse Now" respectively. On those films they found themselves
working with Directors who were not just looking for powerful sound effects to attach to
a structure that was already in place. By experimenting with sound, playing with sound
(and not just sound effects, but music and dialog as well) all through production and post
production what Francis Coppola, Walter Murch, George Lucas, and Ben Burtt found is
that sound began to shape the picture sometimes as much as the picture shaped the
sound. The result was very different from anything we had heard before. The films are
legends, and their soundtracks changed forever the way we think about film sound.

What passes for "great sound" in films today is too often merely loud sound. High
fidelity recordings of gunshots and explosions, and well fabricated alien creature
vocalizations do not constitute great sound design. A well-orchestrated and recorded
piece of musical score has minimal value if it hasn’t been integrated into the film as a
whole. Giving the actors plenty of things to say in every scene isn’t necessarily doing
them, their characters, or the movie a favor. Sound, musical and otherwise, has value
when it is part of a continuum, when it changes over time, has dynamics, and resonates
with other sound and with other sensory experiences.

What I propose is that the way for a filmmaker to take advantage of sound is not simply
to make it possible to record good sound on the set, or simply to hire a talented sound
designer/composer to fabricate sounds, but rather to design the film with sound in mind,
to allow sound’s contributions to influence creative decisions in the other crafts. Films as
different from "Star Wars" as "Citizen Kane," "Raging Bull," "Eraserhead," "The
Elephant Man," "Never Cry Wolf" and "Once Upon A Time In The West" were
thoroughly "sound designed," though no sound designer was credited on most of them.

Does every film want, or need, to be like Star Wars or Apocalypse Now? Absolutely
not. But lots of films could benefit from those models. Sidney Lumet said in an interview
that he had been amazed at what Francis Coppola and Walter Murch had been able to
accomplish in the mix of "Apocalypse Now." Well, what was great about that mix began
long before anybody got near a dubbing stage. In fact, it began with the script, and with
Coppola’s inclination to give the characters in "Apocalypse" the opportunity to listen to
the world around them.

Many directors who like to think they appreciate sound still have a pretty narrow idea of
the potential for sound in storytelling. The generally accepted view is that it’s useful to
have "good" sound in order to enhance the visuals and root the images in a kind of
temporal reality. But that isn’t collaboration, it’s slavery. And the product it yields is
bound to be less complex and interesting than it would be if sound could somehow be set
free to be an active player in the process. Only when each craft influences every other
craft does the movie begin to take on a life of its own.

A Thing Almost Alive


It is a common myth that the time for film makers to think seriously about sound is at the
end of the film making process, when the structure of the movie is already in place. After
all, how is the composer to know what kind of music to write unless he/she can examine
at least a rough assembly of the final product? For some films this approach is adequate.
Rarely, it works amazingly well. But doesn’t it seem odd that in this supposedly
collaborative medium, music and sound effects rarely have the opportunity to exert any
influence on the non-sound crafts? How is the Director supposed to know how to make
the film without having a plan for using music?
A dramatic film which really works is, in some senses, almost alive, a complex web of
elements which are interconnected, almost like living tissues, and which despite their
complexity work together to present a more-or-less coherent set of behaviors. It doesn’t
make any sense to set up a process in which the role of one craft, sound, is simply to
react, to follow, to be pre-empted from giving feedback to the system it is a part of.

The Basic Terrain, As It Is Now


Many feature film directors tend to oscillate between two wildly different states of
consciousness about sound in their movies. On one hand, they tend to ignore any serious
consideration of sound (including music) throughout the planning, shooting, and early
editing. Then they suddenly get a temporary dose of religion when they realize that there
are holes in the story, weak scenes, and bad edits to disguise. Now they develop
enormous and short-lived faith in the power and value of sound to make their movie
watchable. Unfortunately it’s usually way too late, and after some vain attempts to stop a
hemorrhage with a bandaid, the Director’s head drops, and sound cynicism rules again
until late in the next project’s post production.

What follows is a list of some of the bleak realities faced by those of us who work in
film sound, and some suggestions for improving the situation.

Pre-Production
If a script has lots of references in it to specific sounds, we might be tempted to jump to
the conclusion that it is a sound-friendly script. But this isn’t necessarily the case. The
degree to which sound is eventually able to participate in storytelling will be more
determined by the use of time, space, and point of view in the story than by how often the
script mentions actual sounds. Most of the great sound sequences in films are "pov"
sequences. The photography, the blocking of actors, the production design, art direction,
editing, and dialogue have been set up such that we, the audience, are experiencing the
action more or less through the point of view of one, or more, of the characters in the
sequence. Since what we see and hear is being filtered through their consciousness, what
they hear can give us lots of information about who they are and what they are feeling.
Figuring out how to use pov, as well as how to use acoustic space and the element of
time, should begin with the writer. Some writers naturally think in these terms, most
don’t. And it is almost never taught in film writing courses.

Serious consideration of the way sound will be used in the story is typically left up to the
director. Unfortunately, most directors have only the vaguest notions of how to use
sound because they haven’t been taught it either. In virtually all film schools sound is
taught as if it were simply a tedious and mystifying series of technical operations, a
necessary evil on the way to doing the fun stuff.

Production
On the set, virtually every aspect of the sound crew’s work is dominated by the needs of
the camera crew. The locations for shooting have been chosen by the Director, DP, and
Production Designer long before anyone concerned with sound has been hired. The sets
are typically built with little or no concern for, or even awareness of, the implications for
sound. The lights buzz, the generator truck is parked way too close. The floor or ground
could easily be padded to dull the sound of footsteps when feet aren’t in the shot, but
there isn’t enough time. The shots are usually composed, blocked, and lit with very little
effort toward helping either the location sound crew or the post production crew take
advantage of the range of dramatic potential inherent in the situation. In nearly all cases,
visual criteria determine which shots will be printed and used. Any moment not
containing something visually fascinating is quickly trimmed away.

There is rarely any discussion, for example, of what should be heard rather than seen. If
several of our characters are talking in a bar, maybe one of them should be over in a dark
corner. We hear his voice, but we don’t see him. He punctuates the few things he says
with the sound of a bottle he rolls back and forth on the table in front of him. Finally he
puts a note in the bottle and rolls it across the floor of the dark bar. It comes to a stop at
the feet of the characters we see. This approach could be played for comedy, drama, or
some of both as it might have been in "Once Upon A Time In The West." Either way,
sound is making a contribution. The use of sound will strongly influence the way the
scene is set up. Starving the eye will inevitably bring the ear, and therefore the
imagination, more into play.

Post Production
Finally, in post, sound cautiously creeps out of the closet and attempts meekly to assert
itself, usually in the form of a composer and a supervising sound editor. The composer is
given four or five weeks to produce seventy to ninety minutes of great music. The
supervising sound editor is given ten to fifteen weeks to—smooth out the production
dialog—spot, record, and edit ADR—and try to wedge a few specific sound effects into
sequences that were never designed to use them, being careful to cover every possible
option the Director might want because there "isn’t any time" for the Director to make
choices before the mix. Meanwhile, the film is being continuously re-edited. The Editor
and Director, desperately grasping for some way to improve what they have, are
meticulously making adjustments, mostly consisting of a few frames, which result in the
music, sound effects, and dialog editing departments having to spend a high percentage
of the precious time they have left trying to fix all the holes caused by new picture
changes.

The dismal environment surrounding the recording of ADR is in some ways symbolic of
the secondary role of sound. Everyone acknowledges that production dialog is almost
always superior in performance quality to ADR. Most directors and actors despise the
process of doing ADR. Everyone goes into ADR sessions assuming that the product will
be inferior to what was recorded on the set, except that it will be intelligible, whereas the
set recording (in most cases where ADR is needed) was covered with noise and/or is
distorted.

This lousy attitude about the possibility of getting anything wonderful out of an ADR
session turns, of course, into a self fulfilling prophecy. Essentially no effort is typically
put into giving the ADR recording experience the level of excitement, energy, and
exploration that characterized the film set when the cameras were rolling. The result is
that ADR performances almost always lack the "life" of the original. They’re more-or-
less in sync, and they’re intelligible. Why not record ADR on location, in real-world
places which will inspire the actors and provide realistic acoustics? That would be taking
ADR seriously. Like so many other sound-centered activities in movies, ADR is treated as
basically a technical operation, to be gotten past as quickly and cheaply as possible.

Taking Sound Seriously


If your reaction to all this is "So, what do you expect, isn’t it a visual medium?" there
may be nothing I can say to change your mind. My opinion is that film is definitely not a
"visual medium." I think if you look closely at and listen to a dozen or so of the movies
you consider to be great, you will realize how important a role sound plays in many if not
most of them. It is even a little misleading to say "a role sound plays" because in fact
when a scene is really clicking, the visual and aural elements are working together so
well that it is nearly impossible to distinguish them. The suggestions I’m about to make
obviously do not apply to all films. There will never be a "formula" for making great
movies or great movie sound. Be that as it may....

Writing For Sound


Telling a film story, like telling any kind of story, is about creating connections
between characters, places, objects, experiences, and ideas. You try to invent a world
which is complex and many layered, like the real world. But unlike most of real life
(which tends to be badly written and edited), in a good film a set of themes emerge
which embody a clearly identifiable line or arc, which is the story.

It seems to me that one element of writing for movies stands above all others in terms
of making the eventual movie as "cinematic" as possible: establishing point of view.
The audience experiences the action through its identification with characters. The
writing needs to lay the ground work for setting up pov before the actors, cameras,
microphones, and editors come into play. Each of these can obviously enhance the
element of pov, but the script should contain the blueprint.
Let’s say we are writing a story about a guy who, as a boy, loved visiting his father at
the steel mill where he worked. The boy grows up and seems to be pretty happy with his
life as a lawyer, far from the mill. But he has troubling, ambiguous nightmares that
eventually lead him to go back to the town where he lived as a boy in an attempt to find
the source of the bad dreams.

The description above doesn’t say anything specific about the possible use of sound in
this story, but I have chosen basic story elements which hold vast potential for sound.
First, it will be natural to tell the story more-or-less through the pov of our central
character. But that’s not all. A steel mill gives us a huge palette for sound. Most
importantly, it is a place which we can manipulate to produce a set of sounds which
range from banal to exciting to frightening to weird to comforting to ugly to beautiful.
The place can therefore become a character, and have its own voice, with a range of
"emotions" and "moods." And the sounds of the mill can resonate with a wide variety
of elements elsewhere in the story. None of this good stuff is likely to happen unless we
write, shoot, and edit the story in a way that allows it to happen.

The element of dream in the story swings a door wide open to sound as a collaborator.
In a dream sequence we as film makers have even more latitude than usual to modulate
sound to serve our story, and to make connections between the sounds in the dream and
the sounds in the world for which the dream is supplying clues. Likewise, the "time
border" between the "little boy" period and the "grown-up" period offers us lots of
opportunities to compare and contrast the two worlds, and his perception of them. Over
a transition from one period to the other, one or more sounds can go through a
metamorphosis. Maybe as our guy daydreams about his childhood, the rhythmic clank of
a metal shear in the mill changes into the click clack of the railroad car taking him back
to his home town. Any sound, in itself, only has so much intrinsic appeal or value. On
the other hand, when a sound changes over time in response to elements in the larger
story, its power and richness grow exponentially.

Opening The Door For Sound, Efficient Dialog


Sadly, it is common for a director to come to me with a sequence composed of
unambiguous, unmysterious, and uninteresting shots of a location like a steel mill, and
then to tell me that this place has to be made sinister and fascinating with sound effects.
As icing on the cake, the sequence typically has wall-to-wall dialog which will make it
next to impossible to hear any of the sounds I desperately throw at the canvas.

In recent years there has been a trend, which may be an insidious influence of bad
television, toward non-stop dialog in films. The wise old maxim that it’s better to say it
with action than words seems to have lost some ground. Quentin Tarantino has made
some excellent films which depend heavily on dialog, but he has also incorporated scenes
which use dialog sparingly.

There is a phenomenon in movie making that my friends and I sometimes call the
"100% theory." Each department-head on a film, unless otherwise instructed, tends to
assume that it is 100% his or her job to make the movie work. The result is often a
logjam of uncoordinated visual and aural product, each craft competing for attention, and
often adding up to little more than noise unless the director and editor do their jobs
extremely well.
Dialogue is one of the areas where this inclination toward density is at its worst. On top
of production dialog, the trend is to add as much ADR as can be wedged into a scene.
Eventually, all the space not occupied by actual words is filled with grunts, groans, and
breathing (supposedly in an effort to "keep the character alive"). Finally, the track is
sometimes saved from being a self-parody only by the fact that there is so much other
sound happening simultaneously that at least some of the added dialog is masked. If your
intention is to pack your film with wall-to-wall clever dialog, maybe you should consider
doing a play instead.

Characters need to have the opportunity to listen.


When a character looks at an object, we the audience are looking at it, more-or-less
through his eyes. The way he reacts to seeing the object (or doesn’t react) can give us
vital information about who he is and how he fits into this situation. The same is true for
hearing. If there are no moments in which our character is allowed to hear the world
around him, then the audience is deprived of one important dimension of HIS life.

Picture and Sound as Collaborators


Sound effects can make a scene scary and interesting as hell, but they usually need a
little help from the visual end of things. For example, we may want to have a strange-
sounding machine running off-camera during a scene in order to add tension and
atmosphere. If there is at least a brief, fairly close shot of some machine which could be
making the sound, it will help me immensely to establish the sound. Over that shot we
can feature the sound, placing it firmly in the minds of the audience. Then we never have
to see it again, but every time the audience hears it, they will know what it is (even if it is
played very low under dialogue), and they will make all the appropriate associations,
including a sense of the geography of the place.

The contrast between a sound heard at a distance, and that same sound heard close-up
can be a very powerful element. If our guy and an old friend are walking toward the mill,
and they hear, from several blocks away, the sounds of the machines filling the
neighborhood, there will be a powerful contrast when they arrive at the mill gate. As a
former production sound mixer, if a director had ever told me that a scene was to be shot
a few blocks away from the mill set in order to establish how powerfully the sounds of
the mill hit the surrounding neighborhood, I probably would have gone straight into a
coma after kissing his feet. Directors essentially never base their decisions about where
to shoot a scene on the need for sound to make a story contribution. Why not?

Art Direction and Sound as Collaborators


Let’s say we’re writing a character for a movie we’re making. This guy is out of
money, angry, desperate. We need, obviously, to design the place where he lives.
Maybe it’s a run-down apartment in the middle of a big city. The way that place looks
will tell us (the audience) enormous amounts about who the character is and how he is
feeling. And if we take sound into account when we do the visual design then we have
the potential for hearing through his ears this terrible place he inhabits. Maybe water and
sewage pipes are visible on the ceiling and walls. If we establish one of those pipes in a
close-up it will do wonders for the sound designer’s ability to create the sounds of stuff
running through and vibrating all the pipes. Without seeing the pipes we can still put
"pipe sounds" into the track, but it will be much more difficult to communicate to the
audience what those sounds are. One close-up of a pipe, accompanied by grotesque
sewage pipe sounds, is all we need to clearly tell the audience how sonically ugly this
place is. After that, we only need to hear those sounds and the audience will make the
connection to the pipes without our even having to show them.

It’s wonderful when a movie gives you the sense that you really know the places in it.
That each place is alive, has character and moods. A great actor will find ways to use the
place in which he finds himself in order to reveal more about the person he plays. We
need to hear the sounds that place makes in order to know it. We need to hear the actor’s
voice reverberating there. And when he is quiet we need to hear the way that place will
be without him.

Starving The Eye, The Usefulness Of Ambiguity


Viewers/listeners are pulled into a story mainly because they are led to believe that there
are interesting questions to be answered, and that they, the audience, may possess certain
insights useful in solving the puzzle. If this is true, then it follows that a crucial element
of storytelling is knowing what not to make immediately clear, and then devising
techniques that use the camera and microphone to seduce the audience with just enough
information to tease them into getting involved. It is as if our job is to hang interesting
little question marks in the air surrounding each scene, or to place pieces of cake on the
ground that seem to lead somewhere, though not in a straight line.
Sound may be the most powerful tool in the filmmaker’s arsenal in terms of its ability
to seduce. That’s because "sound," as the great sound editor Alan Splet once said, "is a
heart thing." We, the audience, interpret sound with our emotions, not our intellect.

Let’s assume we as film makers want to take sound seriously, and that the first issues
have already been addressed:

1) The desire exists to tell the story more-or-less through the point of view of one or
more of the characters.

2) Locations have been chosen, and sets designed which don’t rule out sound as a
player, and in fact, encourage it.

3) There is not non-stop dialog.

Here are some ways to tease the eye, and thereby invite the ear to the party:

The Beauty of Long Lenses and Short Lenses


There is something odd about looking through a very long lens or a very short lens. We
see things in a way we don’t ordinarily see them. The inference is often that we are
looking through someone else’s eyes. In the opening sequence of "The Conversation" we
see people in San Francisco’s Union Square through a telephoto lens. The lack of depth
of field and other characteristics of that kind of lens puts us into a very subjective space.
As a result, we can easily justify hearing sounds which may have very little to do with
what we see in the frame, and more to do with the way the person ostensibly looking
through that lens FEELS. The way we use such a shot will determine whether that
inference is made obvious to the audience, or kept subliminal.

Dutch Angles and Moving Cameras


The shot may be from floor level or ceiling level. The frame may be rotated a few
degrees off vertical. The camera may be on a track, hand held, or just panning. In any of
these cases the effect will be to put the audience in unfamiliar space. The shot will no
longer simply be "depicting" the scene. The shot becomes part of the scene. The element
of unfamiliar space suddenly swings the door wide-open to sound.

Darkness Around the Edge Of the Frame


In many of the great film noir classics the frame was carefully composed with areas of
darkness. Though we in the audience may not consciously consider what inhabits those
dark splotches, they nevertheless get the point across that the truth, lurking somewhere
just outside the frame is too complex to let itself be photographed easily. Don’t forget
that the ears are the guardians of sleep. They tell us what we need to know about the
darkness, and will gladly supply some clues about what’s going on.

Extreme Close-ups and Long Shots


Very close shots of people’s hands, their clothing, etc. will tend to make us feel as
though we are experiencing things through the point of view of either the person being
photographed or the person whose view of them we are sharing. Extreme long shots are
wonderful for sound because they provide an opportunity to hear the fullness or
emptiness of a vast landscape. Carroll Ballard’s films The Black Stallion and Never Cry
Wolf use wide shots and extreme close-ups wonderfully with sound.

Slow Motion
Raging Bull and Taxi Driver contain some obvious, and some very subtle uses of slow
motion. Some of it is barely perceptible. But it always seems to put us into a dream-
space, and tell us that something odd, and not very wholesome, is happening.

Black and White Images


Many still photographers feel that black and white images have several artistic
advantages over color. Among them, that black and white shots are often less "busy"
than color images, and therefore lend themselves more to presenting a coherent feeling.
We are surrounded in our everyday lives by color and color images. A black and white
image now is clearly "understood" (felt) to be someone’s point of view, not an
"objective" presentation of events. In movies, like still photography, painting, fiction,
and poetry, the artist tends to be most concerned with communicating feelings rather than
"information." Black and white images have the potential to convey a maximum of
feeling without the "clutter" of color.

Whenever we as an audience are put into a visual "space" in which we are encouraged
to "feel" rather than "think," what comes into our ears can inform those feelings and
magnify them.

What Do All Of These Visual Approaches Have In Common?


They all are ways of withholding information. They muddy the waters a little. When
done well, the result will be the following implication: "Gee folks, if we could be more
explicit about what is going on here we sure would, but it is so damned mysterious that
even we, the storytellers, don’t fully understand how amazing it is. Maybe you can help
us take it a little farther." That message is the bait. Dangle it in front of an audience and
they won’t be able to resist going for it. In the process of going for it they bring their
imaginations and experiences with them, making your story suddenly become their story.
Success.

We, the film makers, are all sitting around a table in pre-production, brainstorming
about how to manufacture the most delectable bait possible, and how to make it seem like
it isn’t bait at all. (Aren’t the most interesting stories always told by guys who have to be
begged to tell them?) We know that we want to sometimes use the camera to withhold
information, to tease, or to put it more bluntly: to seduce. The most compelling method
of seduction is inevitably going to involve sound as well.

Ideally, the unconscious dialog in the minds of the audience should be something like:
"What I’m seeing isn’t giving me enough information. What I’m hearing is ambiguous,
too. But the combination of the two seems to be pointing in the direction of a vaguely
familiar container into which I can pour my experience and make something I never
before quite imagined." Isn’t it obvious that the microphone plays just as important a role
in setting up this performance as does the camera?

Editing Picture With Sound In Mind


One of the many things a film editor does is to get rid of moments in the film in which
"nothing" is happening. A desirable objective most of the time, but not always. The
editor and director need to be able to figure out when it will be useful to linger on a shot
after the dialog is finished, or before it begins. To stay around after the obvious "action"
is past, so that we can listen. Of course it helps quite a bit if the scene has been shot with
these useful pauses in mind. Into these little pauses sound can creep on its stealthy little
toes, or its clanking jackboots, to tell us something about where we have been or where
we are going.

Walter Murch, film editor and sound designer, uses lots of unconventional techniques.
One of them is to spend a certain period of his picture editing time not listening to the
sound at all. He watches and edits the visual images without hearing the sync sound
which was recorded as those images were photographed. This approach can ironically be
a great boon to the use of sound in the movie. If the editor can imagine the sound
(musical or otherwise) which might eventually accompany a scene, rather than listen to
the rough, discontinuous, often annoying sync track, then the cutting will be more likely
to leave room for those beats in which sound other than dialog will eventually make its
contribution.

Sound’s Talents
Music, dialogue, and sound effects can each do any of the following jobs, and many
more:

suggest a mood, evoke a feeling
set a pace
indicate a geographical locale
indicate a historical period
clarify the plot
define a character
connect otherwise unconnected ideas, characters, places, images, or moments
heighten realism or diminish it
heighten ambiguity or diminish it
draw attention to a detail, or away from it
indicate changes in time
smooth otherwise abrupt changes between shots or scenes
emphasize a transition for dramatic effect
describe an acoustic space
startle or soothe
exaggerate action or mediate it

At any given moment in a film, sound is likely to be doing several of these things at
once.

But sound, if it’s any good, also has a life of its own, beyond these utilitarian functions.
And its ability to be good and useful to the story, and powerful, beautiful and alive will
be determined by the state of the ocean in which it swims, the film. Try as you may to
paste sound onto a predetermined structure, the result will almost always fall short of
your hopes. But if you encourage the sounds of the characters, the things, and the places
in your film to inform your decisions in all the other film crafts, then your movie may just
grow to have a voice beyond anything you might have dreamed.

So, what does a sound designer do?


It was the dream of Walter Murch and others in the wildly creative early days of
American Zoetrope that sound would be taken as seriously as image. They thought that
at least some films could use the guidance of someone well-schooled in the art of sound
in storytelling to not only create sounds but also to coordinate the use of sound in the
film. This someone, they thought, would brainstorm with the director and writer in pre-
production to integrate sound into the story on the page. During shooting that person
would make sure that the recording and playing-back of sound on the set was given the
important status it deserves, and not treated as a low-priority, which is always the
temptation in the heat of trying to make the daily quota of shots. In post production that
person would continue the fabrication and collection of sounds begun in pre-production,
and would work with other sound professionals (composers, editors, mixers), and the
Director and Editor to give the film’s soundtrack a coherent and well-coordinated
feeling.

This dream has been a difficult one to realize, and in fact has made little headway since
the early 1970s. The term sound designer has come to be associated simply with using
specialized equipment to make "special" sound effects. On "THX-1138" and "The
Conversation" Walter Murch was the Sound Designer in the fullest sense of the word.
The fact that he was also a Picture Editor on "The Conversation" and "Apocalypse Now"
put him in a position to shape those films in ways that allowed them to use sound in an
organic and powerful way. No other sound designers on major American films have had
that kind of opportunity.

So, the dream of giving sound equal status to image is deferred. Someday the Industry
may appreciate and foster the model established by Murch. Until then, whether you cut
the dialog, write the script, record music, perform foley, edit the film, direct the film or
do any one of a hundred other jobs, anybody who shapes sound, edits sound, or even
considers sound when making a creative decision in another craft is, at least in a limited
sense, designing sound for the movie, and designing the movie for sound.

11) Re-recording (RR) or final mix of the film:

Re-recording, also called the final mix, is the last stage of film sound production. When
everything else is done, including track laying, sound design, and music scoring, all the
material goes to a final mix engineer, who mixes and masters the film in whichever
format the producer/director wants: stereo, DTS, Dolby Digital 5.1 surround, and so on.
Let’s assume the film is to be mixed in Dolby Digital 5.1 surround. The basic preparation
involves importing all the material (dialogues, music, sound effects, ambiences, etc.)
into separate sessions and premixing each into a single 5.1 stem (in simple words, a
group), so that there is one stem each for dialogue, music, effects, and ambience. These
individual stems are then imported into a master session, where everything is mixed
properly and captured as a final master 5.1 stem. The master stem is then taken onto a
MOD (Magneto-Optical Disk) for mastering. The mastered MOD is sent for optical
transfer onto a sound negative, which is later processed with the picture negative to
produce a positive release print, called the married print of the film, which is what we
see in theatres.
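The premix-then-master idea described above can be reduced to plain sample arithmetic. The sketch below is illustrative only, with made-up numbers and mono tracks standing in for 5.1 channel beds; a real final mix is done in a DAW on the engineer's console, not in a script.

```python
def premix(tracks, gains):
    """Sum several mono tracks (lists of samples) into one stem,
    applying a per-track gain -- like premixing individual dialogue
    clips into a single dialogue stem."""
    length = max(len(t) for t in tracks)
    stem = [0.0] * length
    for track, gain in zip(tracks, gains):
        for i, sample in enumerate(track):
            stem[i] += sample * gain
    return stem

def master_mix(stems):
    """Combine the premixed stems (dialogue, music, fx, ambience)
    into the final master, clamping to the legal -1.0..1.0 range
    so the summed signal cannot clip."""
    length = max(len(s) for s in stems)
    mix = [0.0] * length
    for s in stems:
        for i, sample in enumerate(s):
            mix[i] += sample
    return [max(-1.0, min(1.0, x)) for x in mix]

# Two dialogue clips premixed into one stem, then combined with a
# (hypothetical) music stem into the master.
dialog_stem = premix([[0.2, 0.4], [0.1, 0.1]], gains=[1.0, 0.5])
music_stem = [0.3, 0.3]
master = master_mix([dialog_stem, music_stem])
```

The point of the two-step structure is the one made in the text: balance decisions inside a group (dialogue against dialogue) are settled in the premix, so the master session only has to balance a handful of stems against each other.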
Post Production Workflow

It takes many hours of footage to make a typical two-hour movie. Many scenes will be
eliminated, or cut, in order to get the very best shots possible. Sound is recorded in
various locations, using different formats, before, during, and after filming; special
effects, too, are added after filming.

The post production workflow revolves around combining, or mixing, all these pieces
into one polished, beautifully executed picture. The post production workflow may take
longer than the actual filming process did.

Other motion picture productions – television shows and commercials, corporate and
music videos – follow the same basic post production workflow although on a smaller
scale.

It is during the post production workflow that the film editor works with the director to
choose the best scenes possible and to decide when to reshoot if necessary. Special effects
are added during the post production workflow.

The audio aspects of post production workflow become mixed with the video imagery
during this time. Sound and image are linked and reviewed to make sure the timing is
appropriate and that sound quality and volume are compatible with the filmed sequences.

The music director may have been at work on the score during the entire time of filming
but it's during the post production workflow that the score and the story come together.
Three aspects of sound are combined at this stage of production – the dialog, score, and
special sound and Foley effects. The precise timing required to mix the sound of these
three tracks into the visual imagery is a meticulous and time-consuming process but it is
vital to get this mixed as flawlessly as possible.

Once a final cut has been developed, the post production workflow turns to a larger
audience for feedback. The production will be screened by a targeted audience and
further shooting or editing will be made based upon feedback given by the target
audience.

Distribution and marketing are the final stages in the post production workflow involved
with making a motion picture. In the case of a movie, a premiere showing and party is
often a major publicity event accompanied by fanfare, press kits, posters, TV advertising,
reviews, and usually a website.

The very last stage in the post production workflow is the release of the production in the
form of a DVD for sale to the general public. The DVD release often comes several months
after the movie's cinematic release.

Some typical post production workflows are given below.
