US20120173980A1 - System And Method For Web Based Collaboration Using Digital Media - Google Patents

System And Method For Web Based Collaboration Using Digital Media

Info

Publication number
US20120173980A1
Authority
US
United States
Prior art keywords
tool
scene
script
production
user
Prior art date
Legal status
Abandoned
Application number
US13/177,500
Inventor
Eric B. Dachs
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to US13/177,500
Publication of US20120173980A1
Legal status: Abandoned

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 - Administration; Management
    • G06Q 10/10 - Office automation; Time management
    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/02 - Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B 27/031 - Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B 27/034 - Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/19 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B 27/28 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B 27/32 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B 27/327 - Table of contents
    • G11B 27/329 - Table of contents on a disc [VTOC]
    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/34 - Indicating arrangements
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 - Details of television systems
    • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment

Definitions

  • FIG. 1 illustrates a block diagram of the system in accordance with an embodiment.
  • the system 100 includes a script tool 102 , an organizer tool 104 , a production tracking tool 106 , an administrator tool 108 , a viewer tool 110 , a spotting tool 112 , a version tool 114 , and a budget tool 116 .
  • the system 100 may include additional and/or other components which would be useful in the production of a motion picture project, such as, for example, an editing tool.
  • the script tool 102 organizes and categorizes the script into sections upon which the motion picture is organized.
  • the organizer tool 104 categorizes and organizes the script, as well as stored media files, according to the scenes, slates, takes and shots on which the motion picture is based.
  • the production tracking tool 106 enables management of one or more pending projects by providing the user with the necessary information to track the progress of the project and evaluate yet-to-be-completed items.
  • the administrator tool 108 provides the administrative functions which are used to create and maintain user accounts, set up and manage access and security, generate reports and manage files.
  • the viewer tool 110 provides a user interface which displays video and audio as well as allows collaboration and communication among production team members.
  • the spotting tool 112 allows dialogue, voice-overs, sound effects and music to be added to the chosen scenes, takes and shots.
  • the version tool 114 and budget tool 116 provide version and budget information of items in a project.
  • FIG. 2 illustrates an overall architecture in which the system 100 operates, in accordance with an embodiment.
  • the system is configured to allow one or more production team members at multiple remote locations to input, view, and modify (if given permission) data as well as collaborate with others in the production of the motion picture.
  • the various tools of the system are software modules which allow the user to manage the features of the system 100 .
  • the system 100 is modular and is created using an object oriented programming language to allow easy and efficient system modifications and updates.
  • the system software resides on the server 204 in an embodiment.
  • data related to the operation of the system is stored on memory storage area modules 206 which are located on the server 204 and/or remote from the server 204 .
  • data files relating to the content are uploaded onto the server 204 and stored in the memory 206 .
  • data files relating to the content are uploaded and kept on a computer terminal 202 B at the production studio (e.g. editor's studio), whereby the data files are accessed through the terminal 202 B at the studio.
  • the system can be used offline on a single computer or can be used by multiple users on multiple systems over a private or public network.
  • the system can operate in a decentralized fashion utilizing an ad-hoc network of peer computer systems. In this fashion, content and media can be delivered peer-to-peer by one client to another.
  • No one particular computer system may contain all project information, but collectively all information is retained on one or more computers. A degree of redundancy can be included so that project and content availability is not impacted by changing network or computer conditions.
  • Point to point communication may be encrypted over a Virtual Private Network (VPN) with private addressing on both ends.
  • the tools may be isolated into individual domains to enhance security.
  • the system may include one or more firewall and packet filters for enhanced security.
  • the system 100 may include security features which prevent tampering or unauthorized viewing of the content of the project in an embodiment.
  • each person using the system is assigned a security clearance rating which gives the person access to some or all of the features and/or content in the system, depending on his or her rating.
  • one or more of the system's interactive features (e.g. the script tool, production tool, etc.) may likewise be assigned an access rating, so that only users whose clearance meets that rating can use them.
  • content which is uploaded or already stored in the system may have a same or different access rating assigned to it to allow viewing of the content to those who have security clearance of that assigned rating or higher.
  • the entire content may receive one access rating or each file containing content may be selectively assigned same or different access ratings.
  • the access ratings may be assigned by the administrator or by a person having a predetermined security clearance.
  • the security feature tracks and analyzes each individual tool's operations and monitors system performance to ensure that the system is not hacked.
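  • A minimal sketch of how such a clearance check might work, assuming a simple numeric rating model; the User, ContentItem and can_view names below are illustrative and not part of the disclosure:

```python
# Minimal sketch of a clearance check: a user may view an item only if the
# user's clearance rating meets or exceeds the rating assigned to that item.
from dataclasses import dataclass

@dataclass
class User:
    name: str
    clearance: int          # higher number = broader access

@dataclass
class ContentItem:
    title: str
    access_rating: int      # minimum clearance required to view

def can_view(user: User, item: ContentItem) -> bool:
    """Return True if the user's clearance is at or above the item's rating."""
    return user.clearance >= item.access_rating

if __name__ == "__main__":
    editor = User("editor", clearance=2)
    dailies = ContentItem("Scene 16 dailies", access_rating=3)
    print(can_view(editor, dailies))   # False: clearance 2 < rating 3
```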
  • the system is accessed via the Internet in an embodiment, whereby multiple users may access same or different features of the system 100 at the same time or different times.
  • terminals 202A and 202G are shown accessing the administrator tool 108
  • terminal 202C is shown accessing the viewer tool 110
  • terminal 202D is shown accessing the script tool 102 and the viewer tool 110
  • terminals 202E and 202F are shown accessing the EFX tool 120.
  • the system 100 updates all data input into the system automatically to allow any modifications, notes and/or messages to be seen by one or more users in real time when accessing the system at the same time.
  • the system thereby allows users or different groups of users to collaborate in real time on different levels simultaneously.
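  • A rough sketch of the real-time update idea, assuming an in-process publish/subscribe registry; a deployed system would push changes over the network (e.g. via websockets), and the UpdateBus name is illustrative only:

```python
# Sketch: when one user posts a change to an item, every user currently
# subscribed to that item is notified immediately.
from collections import defaultdict
from typing import Callable, DefaultDict, List

Listener = Callable[[str, dict], None]

class UpdateBus:
    def __init__(self) -> None:
        self._listeners: DefaultDict[str, List[Listener]] = defaultdict(list)

    def subscribe(self, item_id: str, listener: Listener) -> None:
        self._listeners[item_id].append(listener)

    def publish(self, item_id: str, change: dict) -> None:
        # every subscribed user sees the modification as soon as it is posted
        for listener in self._listeners[item_id]:
            listener(item_id, change)

if __name__ == "__main__":
    bus = UpdateBus()
    bus.subscribe("scene-16", lambda i, c: print(f"user A sees {i}: {c}"))
    bus.subscribe("scene-16", lambda i, c: print(f"user B sees {i}: {c}"))
    bus.publish("scene-16", {"note": "move cut to 0028+00"})
```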
  • the system is alternatively a peer to peer network in which data is shared among computers which are not necessarily linked to one or more central servers.
  • the system 100 includes a script tool 102 in accordance with an embodiment.
  • the script tool 102 organizes and categorizes the script into sections upon which the motion picture is organized.
  • the script for a motion picture serves as the structure upon which the scenes, slates, locations, dialogue, camera angles, characters, and all other information upon which the motion picture is based and organized.
  • while the script tool 102 may be a powerful feature of the system 100, it is an optional tool. For example, the production team may use a paper version of the script and manually generate the scenes, set-ups and slates in the script tool 102 which are eventually used by the system in producing the motion picture.
  • the script tool 102 allows the script to be imported into the system 100 .
  • the script may be on paper and then scanned into the system, whereby optical character recognition (OCR) software converts the scanned document into an appropriate format for the system 100 .
  • OCR optical character recognition
  • the imported script may be editable in the organizer tool 104 using a word processing program.
  • the script is read-only (not editable), and is thus placed on the system only for viewing.
  • the script is directly typed into the system using a word processing program. It is contemplated that the script for a particular project may evolve or portions be rewritten over the course of the project. Thus, the system may store multiple versions and/or drafts of the script for later viewing.
  • FIG. 3 illustrates a script imported into the script tool 102 of the system 100 .
  • a script in general includes dialogue, location settings, visual and/or audio descriptions of events as well as camera effects, character names, designation of visual and/or audio effects, and other information.
  • system 100 allows the user to link the script in the script tool 102 to any or all of the other tools of the system 100 .
  • the script tool 102 allows the user to view all the information associated with a selected portion of the script, from an entire scene to a particular word in the script.
  • the script tool 102 allows the user (e.g. the director, producer) to select and link any portion of the script to the organizer tool 104 and/or any other tools of the system 100 .
  • the portion of the script in FIG. 3 which states SERIES OF DISTORTED IMAGES 14 may be “marked” or assigned to be hyper-linked to the corresponding scene in the organizer tool 104 .
  • the system allows the user to move the cursor on the computer display screen to SERIES OF DISTORTED IMAGES and click on the marked phrase.
  • the system 100 will then automatically navigate the user to the script tool 102 , whereby the user will be able to view the script tool 102 and see the associated information which has been entered in regards to Scene 2 .
  • the system may allow the user to navigate from the script tool 102 to the viewer tool 110 to see all the shots associated with Scene 2 in the script bins as well as notes, comments or other information, discussed below.
  • the system 100 allows the user to navigate from the script tool 102 to the spotting tool 112 to listen to dialogue, sound effects, music score, or other audio which may be incorporated into the particular scene.
  • the user is able to individually click on the SUDDEN MOVEMENT 10 , BARELY AUDIBLE SOUNDS 12 and/or VOICE, LAUGHTER, PUBLIC ADDRESS ANNOUNCEMENTS 16 links in the script tool 102 shown in FIG. 3 .
  • the system 100 then automatically navigates the user to the spotting tool 112 to listen to one or more audio clips associated with the selected link which had been created by the production team (e.g. sound effects studio).
  • the system 100 allows the user to then navigate back to the script tool 102 to view other portions of the script.
  • the script tool 102 is able to also link portions of the scripts to other portions of information not directly related to the scene, such as actor bios and contact information and/or the equipment rental company which will need to be contacted to handle the shot, take or scene which corresponds to the marked phrase.
  • the word FACE 18 may be linked to the actor whose face will appear in the motion picture for that scene.
  • the system may automatically display the actor, her biography, her and her agent's contact information as well as any other information which may pertain to the particular scene in the script (e.g. equipment rental company, etc).
  • the script tool 102 allows the user to highlight one or more words in the script by any appropriate method (e.g. click and drag; point and select; search query, etc). Once the one or more words are selected, the user is able to select a marking tool from a menu in the system 100, whereby the menu provides the user with all of the destination tools to which the selected item(s) may be linked. The user then selects one or more of the desired destination tools (e.g. script tool, viewer tool, spotting tool, etc). The system 100 then creates a hyperlink for the selected word(s) and places the hyperlink between each of the selected destination tools and the selected phrase (see the sketch below). It is also possible to mark the portions of the script from another tool in the system 100.
  • a user viewing a particular scene in the viewer tool 110 may assign that scene or portions of the dialogue to the script directly from the viewer tool. This allows the user to easily mark the script without having to go to the script tool.
  • Hyperlink creation and management is known in the art and not discussed in detail herein.
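  • Continuing the marking flow described above, a minimal sketch of creating a mark that links a highlighted phrase to one or more destination tools; the ScriptMark record and the set of tool names are assumptions for illustration:

```python
# Sketch of marking a phrase in the script and linking it to destination tools.
from dataclasses import dataclass, field
from typing import List

DESTINATION_TOOLS = {"viewer", "spotting", "organizer", "script"}

@dataclass
class ScriptMark:
    phrase: str                    # highlighted word(s) in the script
    scene: int                     # scene the phrase belongs to
    destinations: List[str] = field(default_factory=list)

def mark_phrase(phrase: str, scene: int, destinations: List[str]) -> ScriptMark:
    """Create a mark linking the selected phrase to the chosen destination tools."""
    unknown = set(destinations) - DESTINATION_TOOLS
    if unknown:
        raise ValueError(f"unknown destination tool(s): {unknown}")
    return ScriptMark(phrase, scene, list(destinations))

if __name__ == "__main__":
    mark = mark_phrase("SERIES OF DISTORTED IMAGES", scene=2,
                       destinations=["viewer", "spotting"])
    print(mark)   # selecting the marked phrase would navigate to either linked tool
```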
  • the script tool 102 allows a portion of the script that is already marked to be further marked to different destinations. For example, as shown in FIG. 3 , the portion of the script that is designated with reference numeral 2 is already marked and is linked to the viewer tool.
  • the script tool 102 allows the visual description SERIES OF DISTORTED IMAGES 14 to be further marked and linked only to the spotting tool 112 , for example. Thus, upon selecting on the SERIES OF DISTORTED IMAGES 14 phrase, the system will navigate the user to the spotting tool 112 .
  • the script tool 102 may be configured to provide the user the option to navigate to the viewer tool 110 to see the SERIES OF DISTORTED IMAGES 14 when clicking on the marked paragraph having the reference numeral 2 .
  • the system 100 navigates the user to a common page (not shown) which displays all of the destination tools where more information of the marked word(s) may be found.
  • the system is configured to allow the user to link any two or more items together such that relevant or related information is kept together to aid in collaboration.
  • One example is to link characters to the scenes in which they appear, or to link a PDF to an event on the calendar.
  • linking is used to relate items together but can be used for other purposes like creating shortcuts instead of hyperlinking.
  • the system 100 includes the organizer tool 104 in accordance with an embodiment.
  • the organizer tool 104 allows the user to categorize content associated with the motion picture.
  • the user utilizes the organizer tool 104 to designate content into bins, whereby the bins may be organized based on the script and/or a general outline upon which the motion picture is to be based.
  • the bins are organized by scene, slate, take, shot, etc. to correspond with the direction or story of the motion picture.
  • the bins are configured to be linked to one or more stored media content files, whereby selecting a particular bin will provide video, audio and text associated with each the designated scene, slate, take and/or shot for that bin.
  • the organizer tool 104 provides a reconfigurable and scalable tree of the entire motion picture which allows the production team to break down the motion picture into easily manageable categorized portions, whereby each categorized portion provides the production team members all the necessary information to effectively collaborate, plan and execute that portion.
  • FIG. 4 illustrates a sample screen shot of the organizer tool 104 in accordance with an embodiment.
  • the organizer tool 104 displays several scene tabs, each of which is associated with a particular bin.
  • the scene tabs are shown in FIG. 4 ranging from Scene 13 to Scene 17 .
  • the scene tabs may include a brief description of the scene, which may be entered manually into the system or may be imported from the script tool 102. Alternatively, the scene tabs do not contain any description therein.
  • each scene bin, once selected, displays one or more slates associated with the scene bin.
  • Scene 16 bin has been selected, whereby Scene 16 includes several slate tabs having Slates 49 - 52 .
  • Each slate tab is associated with a corresponding slate bin and may include a brief description of the slate, as shown in FIG. 4. This description in each slate tab may be manually entered into the system or may be imported from the script tool 102. Alternatively, the slate tabs do not contain any description therein.
  • Each slate bin, once selected, displays one or more takes associated with that slate bin.
  • Slate bin 52 is shown to be selected, whereby Slate bin 52 includes several Take bin tabs for Take bins 52-1, 52-4 and 52-7.
  • Each Take bin tab is associated with a corresponding Take bin and may include a brief description of the Take, as shown in FIG. 4 . Alternatively, the Take tabs do not contain any description therein.
  • Each Take bin tab, once selected, displays one or more shots associated with the take.
  • Take bin 52-4 is selected, whereby Take bin 52-4 includes Shot Tabs 52-A1 and 52-B1.
  • Each Shot Tab is associated with a corresponding Shot Bin and may include a brief description of the shot, as shown in FIG. 4. This description in each Shot tab may be manually entered into the system or may be imported from the script tool 102. Alternatively, the Shot tabs do not contain any description therein.
  • Each Shot tab, once selected, displays all pertinent data that is associated with the shot. Such data may be carried over from information originally entered in the scene tab (Scene 16), although not necessarily.
  • Such information may include, but is not limited to, video and/or audio clips of the shot, contact information of the actors in the shot, notes, sound and/or visual effects, production budgets, still shots of the scene, location information and other types of information which would be beneficial for the production member.
  • This information may be manually entered or may be imported into the system.
  • a hyperlink may be included to navigate the user to the script tool 102 to view the portion of the script which refers to the particular scene/shot.
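  • A brief sketch of the Scene/Slate/Take/Shot bin hierarchy the organizer tool maintains, with media and notes attached at the shot level; the dataclass and field names are illustrative assumptions rather than the patented implementation:

```python
# Sketch of nested bins: scenes contain slates, slates contain takes,
# takes contain shots, and media/notes hang off the shot bins.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ShotBin:
    label: str
    media_files: List[str] = field(default_factory=list)
    notes: List[str] = field(default_factory=list)

@dataclass
class TakeBin:
    label: str
    shots: Dict[str, ShotBin] = field(default_factory=dict)

@dataclass
class SlateBin:
    number: int
    takes: Dict[str, TakeBin] = field(default_factory=dict)

@dataclass
class SceneBin:
    number: int
    description: str = ""
    slates: Dict[int, SlateBin] = field(default_factory=dict)

if __name__ == "__main__":
    scene16 = SceneBin(16, "Airport terminal")
    scene16.slates[52] = SlateBin(52)
    scene16.slates[52].takes["52-4"] = TakeBin("52-4")
    scene16.slates[52].takes["52-4"].shots["52-A1"] = ShotBin(
        "52-A1", media_files=["52-A1.mov"], notes=["check framing"])
    print(scene16.slates[52].takes["52-4"].shots["52-A1"])
```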
  • the content bins are selectively organizable in an embodiment.
  • the organizer tool 104 is configured to allow the user to move the bins and thereby reorganize the structure of the Scenes, Slates, Takes and Shots in any desired manner.
  • the Scene tabs can be moved to change the order in which a particular scene is located with respect to the other scenes.
  • Scene 15 in FIG. 4 may be moved to be in between Scenes 13 and 14 by clicking Scene Tab 15 and dragging it between Scene Tabs 13 and 14.
  • the movement of the scene tab is noted by the system, whereby the system displays that the scene (e.g. Scene 15 ) was originally and/or previously between Scenes 13 and 14 .
  • the system is accordingly updated to reflect the change by making a note on the organizer tool 104 , version tool 114 and/or by actually moving the text of the script associated with the Scene 15 to be between the text associated with Scenes 13 and 14 .
  • the system automatically updates the numbering so that Scene Tab 15 is renumbered to become Scene Tab 14 when moved to the position after Scene Tab 13 .
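  • A minimal sketch of moving a scene, renumbering the remaining scenes sequentially, and logging the change for the version tool; the list-of-dictionaries model is an assumption for illustration:

```python
# Sketch: move a scene to a new position, keep scene numbers sequential,
# and record each change so the version tool can display it.
from typing import Dict, List

def move_scene(scenes: List[Dict], from_pos: int, to_pos: int) -> List[str]:
    """Move the scene at from_pos to to_pos (0-based), renumber, and log changes."""
    start = min(s["number"] for s in scenes)   # keep numbering sequential from here
    scene = scenes.pop(from_pos)
    scenes.insert(to_pos, scene)
    log = [f"Scene {scene['number']} ('{scene['title']}') moved to position {to_pos + 1}"]
    for i, s in enumerate(scenes, start=start):
        if s["number"] != i:
            log.append(f"Scene {s['number']} renumbered to Scene {i}")
            s["number"] = i
    return log

if __name__ == "__main__":
    scenes = [{"number": n, "title": t} for n, t in
              [(13, "Lobby"), (14, "Gate"), (15, "Runway"), (16, "Cabin")]]
    # move Scene 15 to the position right after Scene 13, as in FIG. 4
    for entry in move_scene(scenes, from_pos=2, to_pos=1):
        print(entry)
```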
  • the user may simply click on a Scene bin in the organizer tool 104 to access all information (e.g. content for all slates, shots, takes) regarding that particular scene at once.
  • the user may select a Scene bin and be directed stepwise through progressively narrower views of the information in the Scene bin (e.g. viewing only the available slates in the scene bin). For example, as shown in FIG. 5, upon clicking on Scene Tab 16, the user is able to view a vast array of information that is associated with that particular scene. For example, the user is able to view all the Slates, Takes and Shots associated with Scene 16 using the viewer tool 110 as shown in FIG. 5.
  • the organizer tool 104 incorporates the viewer tool 110, wherein the user can view video clips, audio clips, graphical and/or textual information assigned to that particular scene.
  • the organizer tool 104 thereby allows the user to easily view, comment on and/or select the takes or shots for each scene by merely selecting the particular Scene tab.
  • the viewer tool 110 is associated with the organizer tool 104 and allows one or more members of the production team to view and collaborate on content in a selected bin.
  • the viewer tool 110 includes a primary workspace section 502 , a Notes Section 504 , a secondary workspace section 506 , and a scene selector section 508 .
  • the video in the primary workspace section 502 is preferably associated with the selected scene (Scene 16 in FIG. 5 ), although other video or audio content is contemplated.
  • the user is alternatively able to view a still shot, sketch, personnel contact information, the script, notes, or other appropriate information in the area.
  • while the layout of the viewer tool has a particular configuration in FIG. 5, it may have another configuration or layout and is thus not limited thereto.
  • the secondary workspace 506 shown in FIG. 5 is capable of displaying any or all the material that is displayable in the primary workspace 502 .
  • the secondary workspace may be smaller in size than the primary workspace 502, although the user can dynamically customize the secondary workspace 506 to be equal or greater in size compared to the primary workspace 502.
  • the secondary workspace 506 is used to provide additional information to the user while the user is viewing a movie clip or other data.
  • the user is able to view video files in separate video players in the primary and secondary workspaces 502 , 506 .
  • the video clips of different takes may be compared in the primary and secondary workspaces.
  • a particular take may show the subject from one camera angle in the primary workspace 502
  • a different camera angle from the same take is displayed in the secondary workspace 506 .
  • the timing of the clips in the primary and secondary workspaces are synchronized so that the user is able to view the differences between the two synchronized clips.
  • a take from one scene may be played in the video player in the primary workspace 502 while another take from the same scene is played within the video player in the secondary workspace 506 or vice versa.
  • the primary workspace can display a scene where the actor waves with his right hand whereas the user plays a file in the secondary workspace of a different take in which the subject is waving with his left hand. This allows the user to compare the different takes to determine which to use in the cut.
  • the secondary workspace 506 may also display a list of files associated with the particular scene, whereby the files are from a drop-down menu indicated by reference 510 . The user is able to select a file in the list to view it on the primary workspace 502 and/or secondary workspace 506 .
  • the secondary workspace 506 may display the script in the script tool 102 while audio and/or video is played back in the primary workspace 502 .
  • the system may highlight portions of the script as the video and/or audio is played in the primary workspace to allow the user to compare the script with what was actually shot in the take. This would require the video and/or audio to be synchronized with the script via a timestamp or other synchronizing method.
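  • One way such synchronization might be sketched, assuming each script line carries a start time so the line covering the current playback time can be highlighted; the timestamp pairing shown is an assumption, as the description only requires some synchronizing method:

```python
# Sketch: find the script line to highlight for a given playback time.
from bisect import bisect_right
from typing import List, Tuple

ScriptLine = Tuple[float, str]   # (start time in seconds, text)

def line_at(script: List[ScriptLine], playback_time: float) -> str:
    """Return the script line whose time span covers the given playback time."""
    starts = [t for t, _ in script]
    idx = bisect_right(starts, playback_time) - 1
    return script[max(idx, 0)][1]

if __name__ == "__main__":
    script = [(0.0, "INT. TERMINAL - NIGHT"),
              (4.5, "SERIES OF DISTORTED IMAGES"),
              (9.0, "VOICE, LAUGHTER, PUBLIC ADDRESS ANNOUNCEMENTS")]
    print(line_at(script, 6.2))   # -> SERIES OF DISTORTED IMAGES
```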
  • users can post messages to be shared with all the users who have access to the organizer tool 104 or to one or more particular users who have access to the organizer tool 104 .
  • the notes posted may be directed to suggestions or feedback regarding the scene or any other related matter, although not necessarily.
  • the Notes section states that the particular moment in the scene should be moved to timestamp 0028+00
  • the notes may be stored on each user's account, server, or a local computer, whereby the user can remove or highlight one or more notes without affecting the notes on another user's Notes Section.
  • the notes are uploaded and displayed periodically, although the notes may be updated in real time.
  • the user can choose the preferred as well as alternative slates, takes and shots for each scene in a sneak preview mode.
  • the system 100 may then play back the entire motion picture from beginning to end by playing the selected digital audio and video files of the preferred slates, takes, and shots in the order of the scenes (i.e. Scene 1 , then Scene 2 , then Scene 3 , etc.).
  • the system 100 may have a feature to play back the entire or a portion of the motion picture from beginning to end by playing the selected digital audio and video files with selected alternative slates, takes or shots. This allows the user to compare two different versions of the motion picture as a whole.
  • the system may allow the user to switch between the preferred and an alternate (or vice versa) shots at any time while the system 100 plays back in the sneak preview mode.
  • the system may allow the viewer to view two or more versions of the motion picture simultaneously to allow the viewer to compare the versions.
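  • A rough sketch of assembling such a sneak-preview playback list from the preferred (or alternate) take chosen for each scene, in scene order; the selection dictionary below is an illustrative assumption:

```python
# Sketch: build the playback list for the sneak preview, scene by scene,
# optionally swapping in the alternate take for selected scenes.
from typing import Dict, List, Optional

def build_preview(selections: Dict[int, Dict[str, str]],
                  use_alternate_for: Optional[List[int]] = None) -> List[str]:
    """Return the media files to play, in scene order."""
    use_alternate_for = use_alternate_for or []
    playlist = []
    for scene in sorted(selections):
        choice = "alternate" if scene in use_alternate_for else "preferred"
        playlist.append(selections[scene][choice])
    return playlist

if __name__ == "__main__":
    selections = {
        1: {"preferred": "01_take3.mov", "alternate": "01_take7.mov"},
        2: {"preferred": "02_take1.mov", "alternate": "02_take4.mov"},
        3: {"preferred": "03_take2.mov", "alternate": "03_take5.mov"},
    }
    print(build_preview(selections))                         # preferred cut
    print(build_preview(selections, use_alternate_for=[2]))  # comparison version
```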
  • digital media is uploaded to the system directly from the movie studio.
  • digital media is uploaded from members of the production staff and outside sources.
  • the system also is capable of downloading the digital media to diskette, compact disk, flash drives, servers and/or portable or non-portable media playback devices.
  • the digital media files may be stored in a memory on the server 204 or in a separate memory 206 , whereby the files in the memory are able to be easily retrieved from the client terminal.
  • the digital media files are stored on the client application, whereby the system will upload the file automatically or at a designated time.
  • peer-to-peer file sharing is performed between terminals for a particular project, whereby the system tracks the source and destination computers which are sharing the digital file.
  • the digital video files are uploaded in any appropriate format (e.g. avi, mpeg, H.264, etc.). Audio files are uploaded or streamed in an appropriate format as well (e.g. mp3, mp4, wav, wma, asx, AAC, etc.).
  • Upon uploading the files to the system 100, the system 100 prompts the user as to where the uploaded file is to be accessible on the system 100. In particular, the system 100 will request whether the file being uploaded is to be a video clip that will be in the motion picture, and if so, which scene, slate, take and shot the clip is to be located in. Upon the user designating the destination of the file, the system 100 stores the file and places a link to the file at the proper location on the system browser.
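  • A minimal sketch of this upload flow, assuming an in-memory store standing in for the server: the file is saved and a link to it is recorded at the scene/slate/take/shot location the user designates (all names are illustrative):

```python
# Sketch: store an uploaded media file and link it at its designated bin.
from dataclasses import dataclass
from typing import Dict, Tuple

BinKey = Tuple[int, int, str, str]   # (scene, slate, take, shot)

@dataclass
class MediaFile:
    filename: str
    format: str

class MediaStore:
    def __init__(self) -> None:
        self.files: Dict[str, MediaFile] = {}
        self.links: Dict[BinKey, str] = {}

    def upload(self, filename: str, fmt: str, destination: BinKey) -> None:
        """Store the file and link it at the designated location in the browser tree."""
        self.files[filename] = MediaFile(filename, fmt)
        self.links[destination] = filename

if __name__ == "__main__":
    store = MediaStore()
    store.upload("52_A1_cam2.mov", "H.264", destination=(16, 52, "52-4", "52-A1"))
    print(store.links[(16, 52, "52-4", "52-A1")])
```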
  • FIG. 6 illustrates a screen shot of another type of layout of the viewer tool 110 in accordance with an embodiment.
  • the viewer tool 110 allows for posting of content for approval and revision requests, whereby members of the production team can view and approve or request revisions quickly and conveniently.
  • the viewer tool 110 may display still shots, video, concept art, storyboards, animatics, motion capture files, models, animation files, audio, and/or other content.
  • the viewer tool 110 is configured to handle the format of the digital content for proper playback.
  • the viewer tool provides the time code (TC) of the content during playback in an embodiment.
  • the time coded material may be presented by the viewer tool 110 in SMPTE format or FF format.
  • the viewer tool 110 allows the user to place a marker on the playback of the video (or audio), whereby the marker corresponds to the time code content.
  • the marker allows members of the production team to comment on an exact point of content and communicate exactly where in the playback the comment applies.
  • a marker icon (or the actual time stamp) is displayed next to the note in an embodiment. A person reading the note can then select the marker, whereby the viewer tool 110 will playback the content from that exact time stamp.
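  • A short sketch of such a time-coded marker, assuming a fixed 24 fps rate for formatting the SMPTE-style label; the Marker structure and frame rate are illustrative assumptions:

```python
# Sketch: a comment pinned to an exact frame, rendered with an SMPTE-style
# time code so playback can resume from that point.
from dataclasses import dataclass

FPS = 24

def to_smpte(frame: int, fps: int = FPS) -> str:
    """Format an absolute frame count as HH:MM:SS:FF."""
    ff = frame % fps
    seconds = frame // fps
    hh, mm, ss = seconds // 3600, (seconds % 3600) // 60, seconds % 60
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

@dataclass
class Marker:
    frame: int
    author: str
    note: str

    def label(self) -> str:
        return f"[{to_smpte(self.frame)}] {self.author}: {self.note}"

if __name__ == "__main__":
    m = Marker(frame=2472, author="editor", note="hold this shot two frames longer")
    print(m.label())   # selecting the marker would play back from this time code
```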
  • the bottom portion of the viewer tool shows a screen that a user might use to place a note to another person.
  • the note may be posted as in a bulletin board or may be sent as a message.
  • the viewer tool 110 may be a stand-alone tool, as shown in FIG. 6.
  • the system 100 includes a spotting tool 112 in accordance with an embodiment.
  • the spotting tool 112 allows dialogue, voice-overs, sound effects and music to be added to the chosen scenes, takes and shots. Dialogue and music may be added at any time during the production of the motion picture, although dialogue, voiceovers, and music are usually applied in post-production.
  • the audio files are uploaded onto the system 100 by one or more members of the production team, although any other persons may upload the data as well.
  • the uploaded audio files can be linked to the marked time stamp discussed above, whereby playback of the video will automatically playback the audio from the spotting tool 112 in synchronization.
  • the spotting tool 112 allows one or more users to make notes, and thus comment, on the audio files.
  • the audio files may be time stamped as well, whereby the time stamp corresponds with the time stamp of the video and/or has a separate time stamp of its own. Therefore, a person can separately mark (and comment) on the audio file in context of the video as well as a portion of the audio file itself.
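  • A small sketch of an audio cue pinned to a video time stamp so that playback from a marked point also starts the linked audio at the right offset; the AudioCue fields are assumptions for illustration:

```python
# Sketch: map a video playback time to the position in a linked audio file.
from dataclasses import dataclass

@dataclass
class AudioCue:
    audio_file: str
    video_start: float     # seconds into the video where the cue begins
    audio_offset: float    # seconds into the audio file to start from

    def audio_position(self, video_time: float) -> float:
        """Audio playback position corresponding to the given video time."""
        return max(0.0, self.audio_offset + (video_time - self.video_start))

if __name__ == "__main__":
    cue = AudioCue("pa_announcement.wav", video_start=12.0, audio_offset=0.5)
    print(cue.audio_position(15.0))   # -> 3.5 seconds into the audio file
```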
  • the system 100 includes a version tool 114 in an embodiment.
  • the version tool displays the versions of the project, the tasks, scene setups, the script, and digital video and audio files, as well as who requested each revision, when the revision was requested, and what the status of the revision is. It is also possible for a user to enter notes about the revision for review by others. These notes can be used for follow-up by the production team members.
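  • A minimal sketch of a revision record such as the version tool might keep; the Revision dataclass and status values are illustrative assumptions, not the tool's actual schema:

```python
# Sketch: what was changed, who requested it, when, its status, and notes.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Revision:
    item: str                  # e.g. a scene, task, script section, or media file
    requested_by: str
    requested_on: date
    status: str = "open"       # e.g. open, in progress, done
    notes: List[str] = field(default_factory=list)

if __name__ == "__main__":
    rev = Revision("Scene 16 / Slate 52", "director", date(2007, 6, 21))
    rev.notes.append("trim the opening two seconds")
    rev.status = "in progress"
    print(rev)
```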
  • FIG. 7 illustrates a block diagram of the production tracking tool 106 in accordance with an embodiment.
  • the production tracking tool 106 enables the management of one or more pending projects by providing the user with the necessary information to track the progress of the project and evaluate yet to-be-completed items.
  • the production tracking tool 106 may be configured to present a one page summary of current, past and future production status information.
  • the production tracking tool 106 includes a calendar tool 302 , a task assignment tool 304 , a personnel/contacts tool 306 , a note tool 308 , and a location tool 310 .
  • the production tracking tool 106 may include additional and/or other components which would be useful in the management of a motion picture project.
  • This information, defined herein as profile information, can be linked to the scenes, takes, shots and/or slates, such that the profile information is automatically displayed to the user when reviewing that particular scene, take, shot and/or slate.
  • the system 100 stores the project status information in a database which may be accessed, viewed and/or modified (by authorized users).
  • the production tracking tool 106 allows viewing of the current status of the project, or project history. Every item involved in the project (“production item”) in the database may be accessed by search regarding its current status. As each production item is created by members of the production team, it is stored in the system's memory. The items added to the system may be required to be approved by one or more other members of the production team before they are stored.
  • the calendar 302 in FIG. 7 allows the creation and maintenance of the production schedule of the project.
  • certain projects in the production of a motion picture may be broken down and categorized further into sub-projects. These sub-projects may be organized and tracked using the production tracking tool 106.
  • the calendar may provide information in a timeline, calendar or other view for past, ongoing and future tasks. The information in the calendar is able to be updated in real time to allow members of the production team to view the status of the project.
  • the calendar tool 302 may be configured to automatically notify one or more persons regarding a schedule change.
  • the calendar tool 302 can display additional information that is not event-based, such as files, contact information, notes, or other profile information as discussed herein. Some examples include placing characters, locations, and script elements on days or weeks for easy access.
  • the calendar tool 302 thus allows items to be placed directly on the calendar versus simply displaying events with items linked to them.
  • the production tracking tool 106 includes a task tool 304 , as shown in FIG. 7 .
  • the task tool allows one to view past, current and future assignments.
  • the task tool 304 may be configured to assign a particular task to one or more persons, whereby the persons are automatically notified of their assignments.
  • the task tool 304 allows pertinent information to be input including, but not limited to, amount of time to complete the task, desired due date, budgeted cost of performing the task (in terms of cost of labor and materials), etc. Other tasks may be arranged to be dependent on the particular task, whereby the assigned person cannot receive the necessary information to complete any future tasks until the current assigned task is completed.
  • the task tool 304 may include a feature which calculates the amount of time needed to complete several tasks for the assigned person to create an acceptable workload for the person.
  • the task tool 304 may be configured to interface with the calendar 302 to identify any scheduling conflicts.
  • the task tool 304 may be configured to interface with the budget tool 116 ( FIG. 1 ) to provide cost information for each task.
  • the task tool 304 can be configured to include one or more “playlists,” each of which is a group or a series of items in a particular order.
  • the system allows one or more playlists to be sent from one user to another, irrespective of security controls placed on the items in the playlists, whereby the playlist includes a number of tasks and profile information (e.g. images, sounds, video clips, zip archives, pdf documents) in a particular order, such that the receiving person can execute the tasks in the given order to make completing the task easier and more efficient.
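  • A brief sketch of the task model described above: tasks with durations and dependencies, a workload total for one assignee, and an ordered playlist of items; all field names are illustrative assumptions:

```python
# Sketch: tasks with dependencies, a workload sum, and an ordered playlist.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    name: str
    assignee: str
    hours: float
    depends_on: List[str] = field(default_factory=list)

def workload(tasks: List[Task], assignee: str) -> float:
    """Total hours currently assigned to one person."""
    return sum(t.hours for t in tasks if t.assignee == assignee)

def ready(tasks: List[Task], done: List[str]) -> List[Task]:
    """Tasks whose dependencies are all complete and may be started."""
    return [t for t in tasks if all(d in done for d in t.depends_on)]

if __name__ == "__main__":
    tasks = [Task("rough cut scene 16", "editor", 6.0),
             Task("color pass scene 16", "colorist", 3.0,
                  depends_on=["rough cut scene 16"])]
    playlist = ["storyboard.pdf", "52_A1_cam2.mov", "notes.txt"]  # ordered items
    print(workload(tasks, "editor"), [t.name for t in ready(tasks, done=[])])
    print(playlist)
```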
  • the personnel/contacts tool 306 shown in FIG. 7 allows the members of the production team to view contact information of all persons who are or may be involved in the project, although not necessarily. Such persons include actors, directors, producers, production engineers, staff, vendors, etc. Specific information as well as multimedia data may be included for each contact in the contact list. For example, digital clips of prior movies which have been done by a particular actor may be included in the actor's profile in the contacts tool 306 . This particular feature may allow members of the production team to preview the type of role that the particular actor may be best served in (e.g. support actors, stunt doubles, etc.)
  • the note tool 308 shown in FIG. 7 allows for communication between persons on the production team regarding the project.
  • the note tool 308 facilitates collaboration regarding the project by allowing the users to effectively communicate to one another regarding particular scenes, slates, takes, as well as shots (or aspects thereof) so that necessary adjustments or changes may be made efficiently and quickly, as discussed more below.
  • the note tool 308 may take the form of postings (such as on a bulletin board), although the note tool 308 may include features like instant messaging, email, fax, Internet calls, etc.
  • the note tool 308 includes verifiable transmission of the note and reception by the intended recipient of the note.
  • the notes are editable and are able to be deleted by authorized users in an embodiment.
  • the note tool 308 may be a separate component from the other tools in the system, the note tool 308 may be integrated within one or more other tools (e.g. viewer tool, script tool, etc.).
  • the production tracking tool 106 may also include a location tool 310 as shown in FIG. 7 .
  • the location tool 310 may provide information for all the desired locations where the motion picture may be shot, whereby the locations may be indoor or outdoor areas.
  • the location tool 310 may be linked up with the script tool 102 to allow the user to set and/or view the location where the portion of the script should be shot.
  • the location tool 310 may be linked to the personnel/contact tool 306 and tasks tool 304 to assign and organize the persons who will be present at the particular location.
  • the location tool 310 may be linked to the note tool 308 to provide notes and/or additional information of the locations.
  • the location tool 310 may have a field to allow the user to insert the cost of using the location for a desired amount of time, whereby the location tool 310 may be linked to the budget tool 116 and may update the budget accordingly.
  • the budget tool 116 allows authorized persons in the production team to manage and edit budgeting of the particular project or sub-projects.
  • the budget tool 116 may provide accounting as well as analysis of the budget for the project or sub-projects, as well as track and analyze invoice details.
  • the budget tool 116 may include any other features which are found in accounting software programs.
  • FIG. 8 illustrates a block diagram of the administrator tool of the system in accordance with an embodiment.
  • the administrator tool 108 provides all the administrative functions which are used to create and maintain user accounts and logins (block 402 ), set up and manage security (block 404 ) and access (block 406 ), generate reports (block 408 ) and manage files (block 410 ).
  • the administrator tool 108 is run by the administrator who has sole internal rights and access to the server 204 ( FIG. 2 ).
  • the administrator tool 108 allows the administrator to modify, add, and delete user groups. Within each user group, the administrator may also add or remove individual users.
  • the administrator tool 108 allows the creation and modification of levels of access of each user.
  • the administrator tool 108 also provides access to reporting features of the system 100 , whereby the reporting feature allows convenient access for management staff to track the progress, efficiencies, and status check for the system, as necessary.
  • the reporting feature may include, but is not limited to, daily operation reports, custom system reports, workload reports, employee reports, accounting reports, past-due task reports, cost/budget reports, and production reports.
  • the administrator tool 108 allows for file management rights in the system.
  • the administrator tool 108 allows the administrator to set and grant uploading and downloading rights to appropriate individuals as well as monitor (and limit) the amount of uploading and downloading permitted by the system 100 .
  • the administrator tool 108 tracks movement of all files in the system 100 as well as corrects any problems associated with file management.

Abstract

A system and method configured to allow production team members to view, organize and manage production of a motion picture. The system and method allows the production team members to organize the project script; extract the script into scenes and slates; and view, organize, select and collaborate on uploaded digital media and takes for each scene, as well as on uploaded audio, music or special effects that are to be applied to the product. The system tracks versions of the project by monitoring each change as well. The system also allows production members to organize project calendars, tasks associated with projects, contacts, notes, budgeting and other facets of the project. The system incorporates security measures which allow only certain members of the production team to have access to designated high-security material.

Description

    STATEMENT OF RELATED APPLICATION(S)
  • The present application is a Continuation of U.S. application Ser. No. 11/821,276, filed on Jun. 21, 2007, which claims the benefit of priority based on U.S. Provisional Patent Application Ser. No. 60/815,968, filed on Jun. 22, 2006, in the name of Eric B. Dachs, entitled “System and Method for Web Based Collaboration of Digital Media”; each application is hereby incorporated by reference in its entirety.
  • TECHNICAL FIELD
  • The subject matter described herein relates to a system and method for web-based collaboration and project management using digital media over a network.
  • BACKGROUND
  • Motion pictures and other content, such as movies, short films, television shows, commercials and music videos, are produced over an extended time and involve a vast amount of communication, time and collaboration to reach a final product. The emergence of and improvements in computer hardware and software, as well as the commercial viability of the Internet and quick upload and download times, have given people around the world the ability to communicate with one another.
  • What is needed is a system and method which allows persons to efficiently and conveniently manage, produce and collaborate on a motion picture or other digital media project.
  • BRIEF DESCRIPTION
  • A system and method to allow motion picture and other content production team members to view and organize the project script and extract the script into scenes, slates and takes. The system and method allows production team members to view, organize, select and collaborate on uploaded shots and takes for each scene, as well as communicate regarding uploaded audio, music or special effects that are to be applied to the motion picture. The system also tracks versions of the project by monitoring each change and allows production team members to organize project calendars, tasks associated with projects, contacts, notes, budgeting and other facets of the project.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate one or more embodiments and, together with the detailed description, serve to explain the principles and implementations of the system and method.
  • In the drawings:
  • FIG. 1 illustrates a block diagram of the system in accordance with an embodiment.
  • FIG. 2 illustrates a schematic diagram of the system architecture in accordance with an embodiment.
  • FIG. 3 illustrates a screen shot of a script tool of the system in accordance with an embodiment.
  • FIG. 4 illustrates a schematic of the script tool of the system in accordance with an embodiment.
  • FIG. 5 illustrates a screen shot of the script tool of the system in accordance with an embodiment.
  • FIG. 6 illustrates a screen shot of a viewer tool of the system in accordance with an embodiment.
  • FIG. 7 illustrates a block diagram of a production tracking tool of the system in accordance with an embodiment.
  • FIG. 8 illustrates a block diagram of an administrator tool of the system in accordance with an embodiment.
  • DETAILED DESCRIPTION
  • Embodiments are described herein in the context of a system of computers, servers, and software. Those of ordinary skill in the art will realize that the following detailed description is illustrative only and is not intended to be in any way limiting. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure. Reference will now be made in detail to implementations of the present system and method as illustrated in the accompanying drawings. The same reference indicators will be used throughout the drawings and the following detailed description to refer to the same or like parts.
  • In the interest of clarity, not all of the routine features of the implementations described herein are shown and described. It will, of course, be appreciated that in the development of any such actual implementation, numerous implementation-specific decisions must be made in order to achieve the developer's specific goals, such as compliance with application- and business-related constraints, and that these specific goals will vary from one implementation to another and from one developer to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking of engineering for those of ordinary skill in the art having the benefit of this disclosure.
  • In accordance with this disclosure, the components, process steps, and/or data structures described herein may be implemented using various types of operating systems, computing platforms, computer programs, and/or general purpose machines. In addition, those of ordinary skill in the art will recognize that devices of a less general purpose nature, such as hardwired devices, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), or the like, may also be used without departing from the scope and spirit of the inventive concepts disclosed herein. It is understood that the phrase “an embodiment” encompasses more than one embodiment and is thus not limited to only one embodiment. Where a method comprising a series of process steps is implemented by a computer or a machine and those process steps can be stored as a series of instructions readable by the machine, they may be stored on a tangible medium such as a computer memory device (e.g., ROM (Read Only Memory), PROM (Programmable Read Only Memory), EEPROM (Electrically Erasable Programmable Read Only Memory), FLASH Memory, Jump Drive, and the like), magnetic storage medium (e.g., tape, magnetic disk drive, and the like), optical storage medium (e.g., CD-ROM, DVD-ROM, paper card, paper tape and the like) and other types of program memory.
  • For purposes of the description, the term “motion picture” is used herein to generally describe the digital content which is organized and utilized in producing a finished work, whether the finished work is a full length movie, short film, television show, commercial, music video, video game, skit, play, performance, promotional work, animated work, and/or other content involving a video component. The process of making a motion picture involves a substantial amount of work in the pre-production, production, and post-production stages. In pre-production, a script may be generated, whereby the script serves as the backbone or structure upon which the motion picture is based. The script may include, among other things, concepts, designs, wardrobes, settings, locations, characters, dialogue, sequences and most importantly, the story. One aspect of pre-production involves a series of sketches or stills which serve to provide an overall look and feel of what each scene will generally look like in the motion picture. It is from the script that the scenes are set up, slated and shot. Although the script may slightly change during the course of production of a motion picture, the script serves as the backbone of the film. Typically, the producers, directors, editors as well as the rest of the production team strictly follow the script from beginning to the end in making the motion picture. It is during the pre-production phase that the film is “set up”, and a substantial amount of collaboration is present between members of the production team to get the project ready to move ahead to the production phase.
  • The production phase of the project usually takes the longest time to complete in the course of making the motion picture. The production phase includes setting up the scenes; slating the scenes; and shooting the scenes, many times with multiple takes. It is during the production phase that the film is “formed.” Thus, the members of the production team continue to collaborate in getting the project to the final phase of the project: post-production.
  • The post-production process includes further editing; applying voice-overs to dialogue in the scenes; adding video and audio effects; adding music sequences; applying color correction and lighting; and final editing. The decision as to which takes and shots are to be in the final product occurs in the production stage as well as the post-production stage. After the motion picture is released or broadcast to be viewed by the audience, follow-up actions are taken such as providing bonus features and commentaries for DVDs and other after-market products.
  • The system and method are directed to a powerful and robust tool which allows members of the production team to quickly and efficiently perform the tasks necessary in pre-production, production and post-production. The system can be used for a project from inception of the script to the final editing phase. The system may be used to collaborate on and manage only one project at a time or multiple projects occurring simultaneously. In general, the system allows production team members to view and organize a script; extract the script into scenes and slates; view and collaborate on shots and takes for each scene; and communicate with one another regarding audio, music or special effects applied to the motion picture. The system tracks versions of the project by recording each change as well. The system also allows production members to organize project calendars, tasks associated with projects, contacts, notes, budgeting and other facets of the project.
  • FIG. 1 illustrates a block diagram of the system in accordance with an embodiment. As shown in FIG. 1, the system 100 includes a script tool 102, an organizer tool 104, a production tracking tool 106, an administrator tool 108, a viewer tool 110, a spotting tool 112, a version tool 114, and a budget tool 116. It should be noted that the system 100 may include additional and/or other components which would be useful in the production of a motion picture project, such as, for example, an editing tool.
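  • A minimal TypeScript sketch of such a modular tool composition, with hypothetical names and structure, might look as follows:

      // Illustrative only; the interface and registry are assumptions, not the disclosed implementation.
      interface Tool {
        readonly name: string;
      }

      class ProductionSystem {
        private tools = new Map<string, Tool>();

        // Register a tool module so other components can look it up by name.
        register(tool: Tool): void {
          this.tools.set(tool.name, tool);
        }

        // Retrieve a previously registered tool, or undefined if absent.
        get(name: string): Tool | undefined {
          return this.tools.get(name);
        }
      }

      // Example: a system composed of the tools enumerated in FIG. 1.
      const system = new ProductionSystem();
      ["script", "organizer", "productionTracking", "administrator",
       "viewer", "spotting", "version", "budget"].forEach((name) => system.register({ name }));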
  • The script tool 102 organizes and categorizes the script into sections upon which the motion picture is organized. The organizer tool 104 categorizes and organizes the script, as well as stored media files, according to the scenes, slates, takes and shots upon which the motion picture is based. The production tracking tool 106 enables management of one or more pending projects by providing the user with the necessary information to track the progress of the project and evaluate yet to-be-completed items. The administrator tool 108 provides the administrative functions which are used to create and maintain user accounts, set up and manage access and security, generate reports and manage files. The viewer tool 110 provides a user interface which displays video and audio as well as allows collaboration and communication among production team members. The spotting tool 112 allows dialogue, voice-overs, sound effects and music to be added to the chosen scenes, takes and shots. The version tool 114 and budget tool 116 provide version and budget information of items in a project.
  • FIG. 2 illustrates an overall architecture in which the system 100 operates in accordance with an embodiment. The system is configured to allow one or more production team members at multiple remote locations to input, view, and modify (if given permission) data as well as collaborate with others in the production of the motion picture. The various tools of the system are software modules which allow the user to manage the features of the system 100. In an embodiment, the system 100 is modular and is created using an object oriented programming language to allow easy and efficient system modifications and updates.
  • As shown in FIG. 2, several computer terminals 202 are connected to the system 100 via the Internet 99. Alternatively or additionally, a wired local area network (LAN) or wireless local area network (WLAN) serves to connect the computer terminals 202 to the system 100. The system software resides on the server 204 in an embodiment. In an embodiment, data related to the operation of the system is stored on memory storage area modules 206 which are located on the server 204 and/or remote from the server 204. In an embodiment, data files relating to the content are uploaded onto the server 204 and stored in the memory 206. In another embodiment, data files relating to the content are uploaded and kept on a computer terminal 202B at the production studio (e.g. editor's studio), whereby the data files are accessed through the terminal 202B at the studio.
  • In an embodiment, the system can be used offline on a single computer or can be used by multiple users on multiple systems over a private or public network. In an embodiment, the system can operate in a decentralized fashion utilizing an ad-hoc network of peer computer systems. In this fashion, content and media can be delivered peer-to-peer by one client to another. No one particular computer system may contain all project information, but collectively all information is retained on one or more computers. A degree of redundancy can be included so that project and content availability is not impacted by changing network or computer conditions.
  • In an embodiment, all network operations are secured by security hardware and software, in addition to internal private addressing schemes and multiple domain structure for increased security. Point-to-point communication may be encrypted over a Virtual Private Network (VPN) with private addressing on both ends. The tools may be isolated into individual domains to enhance security. The system may include one or more firewalls and packet filters for enhanced security.
  • The system 100 may include security features which prevent tampering or unauthorized viewing of the content of the project in an embodiment. In an embodiment, each person using the system is assigned a security clearance rating which gives the person access to some or all of the features and/or content in the system, depending on his or her rating. In addition, one or more of the system's interactive features (e.g. script tool, production tool, etc.) may have an access rating assigned to it such that only those users having a matching security clearance rating or higher may access the features. In addition, content which is uploaded or already stored in the system may have a same or different access rating assigned to it to allow viewing of the content to those who have security clearance of that assigned rating or higher. The entire content may receive one access rating or each file containing content may be selectively assigned same or different access ratings. In an embodiment, the administrator (or person having a predetermined security clearance) of the system or of a particular project may assign the access ratings to the features, content and/or users. In an embodiment, the security feature tracks and analyzes each individual tool's operations and monitors system performance to ensure that the system is not hacked.
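  • The clearance rule described above (access is granted only when a user's rating matches or exceeds an item's rating) might be sketched as follows; the numeric scale and names are assumptions:

      // Hypothetical sketch of the "matching clearance or higher" rule.
      interface RatedItem { accessRating: number; }          // a tool, feature, or content file
      interface User { name: string; clearance: number; }

      // A user may access an item only when their clearance meets or exceeds its rating.
      function canAccess(user: User, item: RatedItem): boolean {
        return user.clearance >= item.accessRating;
      }

      // Example: an editor with clearance 3 may open an item rated 2 but not one rated 5.
      const editor: User = { name: "editor", clearance: 3 };
      console.log(canAccess(editor, { accessRating: 2 })); // true
      console.log(canAccess(editor, { accessRating: 5 })); // false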
  • As shown in FIG. 2, the system is accessed via the Internet in an embodiment, whereby multiple users may access same or different features of the system 100 at the same time or different times. For example, as shown in FIG. 2, terminals 202A and 202G are shown accessing the administrator tool 108, terminal 202C is shown accessing the viewer tool 110, terminal 202D is shown accessing the script tool 102 and the viewer tool 110, terminals 202E and 202F are shown accessing the EFX tool 120. In an embodiment, the system 100 updates all data input into the system automatically to allow any modifications, notes and/or messages to be seen by one or more users in real time when accessing the system at the same time. The system thereby allows users or different groups of users to collaborate in real time on different levels simultaneously. Again, as stated above, the system is alternatively a peer to peer network in which data is shared among computers which are not necessarily linked to one or more central servers.
  • The system 100 includes a script tool 102 in accordance with an embodiment. The script tool 102 organizes and categorizes the script into sections upon which the motion picture is organized. The script for a motion picture serves as the structure upon which the scenes, slates, locations, dialogue, camera angles, characters, and all other information of the motion picture are based and organized. Although the script tool 102 may be a powerful feature of the system 100, it is an optional tool. For example, the production team may use a paper version of the script and manually generate the scenes, set-ups and slates in the script tool 102 which are eventually used by the system in producing the motion picture.
  • In an embodiment, the script tool 102 allows the script to be imported into the system 100. In particular, the script may be on paper and then scanned into the system, whereby optical character recognition (OCR) software converts the scanned document into an appropriate format for the system 100. In an embodiment, the imported script may be edited in the organizer tool 104 using a word processing program. Alternatively, the script is read-only and not editable, and is thus placed on the system only for viewing. In an embodiment, the script is directly typed into the system using a word processing program. It is contemplated that the script for a particular project may evolve or portions may be rewritten over the course of the project. Thus, the system may store multiple versions and/or drafts of the script for later viewing.
  • FIG. 3 illustrates a script imported into the script tool 102 of the system 100. A script in general includes dialogue, location settings, visual and/or audio descriptions of events as well as camera effects, character names, designation of visual and/or audio effects, and other information. In an embodiment, system 100 allows the user to link the script in the script tool 102 to any or all of the other tools of the system 100. Thus, the script tool 102 allows the user to view all the information associated with a selected portion of the script, from an entire scene to a particular word in the script.
  • In particular, as shown in FIG. 3, the script tool 102 allows the user (e.g. the director, producer) to select and link any portion of the script to the organizer tool 104 and/or any other tools of the system 100. For example, the portion of the script in FIG. 3 which states SERIES OF DISTORTED IMAGES 14 may be “marked” or assigned to be hyper-linked to the corresponding scene in the organizer tool 104. Once the phrase is marked, the system allows the user to move the cursor on the computer display screen to SERIES OF DISTORTED IMAGES and click on the marked phrase. In an embodiment, the system 100 will then automatically navigate the user to the organizer tool 104, whereby the user will be able to view the organizer tool 104 and see the associated information which has been entered in regard to Scene 2. For instance, the system may allow the user to navigate from the script tool 102 to the viewer tool 110 to see all the shots associated with Scene 2 in the scene bins as well as notes, comments or other information, discussed below. In an embodiment, the system 100 allows the user to navigate from the script tool 102 to the spotting tool 112 to listen to dialogue, sound effects, music score, or other audio which may be incorporated into the particular scene. For example, the user is able to individually click on the SUDDEN MOVEMENT 10, BARELY AUDIBLE SOUNDS 12 and/or VOICE, LAUGHTER, PUBLIC ADDRESS ANNOUNCEMENTS 16 links in the script tool 102 shown in FIG. 3. The system 100 then automatically navigates the user to the spotting tool 112 to listen to one or more audio clips associated with the selected link which had been created by the production team (e.g. sound effects studio). The system 100 allows the user to then navigate back to the script tool 102 to view other portions of the script.
  • The script tool 102 is also able to link portions of the script to other portions of information not directly related to the scene, such as actor bios and contact information and/or the equipment rental company which will need to be contacted to handle the shot, take or scene which corresponds to the marked phrase. For example, the word FACE 18 may be linked to the actor whose face will appear in the motion picture for that scene. Thus, upon selecting FACE, the system may automatically display the actor, her biography, her and her agent's contact information as well as any other information which may pertain to the particular scene in the script (e.g. equipment rental company, etc.).
  • In marking portions of the script to be linked, the script tool 102 allows the user to highlight one or more words in the script by any appropriate method (e.g. click and drag; point and select; search query, etc.). Once the one or more words are selected, the user is able to select a marking tool from a menu in the system 100, whereby the menu provides the user with all of the destination tools to which the selected item(s) may be linked. The user then selects one or more of the desired destination tools (e.g. script tool, viewer tool, spotting tool, etc.). The system 100 then creates a hyperlink for the selected word(s) and places the hyperlink between each of the selected destination tools and the selected phrase. It is also possible to mark the portions of the script from another tool in the system 100. For example, a user viewing a particular scene in the viewer tool 110 may assign that scene or portions of the dialogue to the script directly from the viewer tool. This allows the user to easily mark the script without having to go to the script tool. Hyperlink creation and management are known in the art and are not discussed in detail herein.
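  • A simplified sketch of recording such marks, assuming each mark stores the selected phrase, its position in the script, and its destination tools, might be:

      // Illustrative sketch; the data shapes are assumptions, not the disclosed implementation.
      type DestinationTool = "organizer" | "viewer" | "spotting";

      interface ScriptMark {
        phrase: string;                    // the highlighted word(s)
        start: number;                     // character offset of the selection in the script
        end: number;
        destinations: DestinationTool[];   // tools the phrase is hyperlinked to
      }

      const marks: ScriptMark[] = [];

      // Record a new mark linking the selected phrase to one or more destination tools.
      function markSelection(phrase: string, start: number, end: number,
                             destinations: DestinationTool[]): ScriptMark {
        const mark: ScriptMark = { phrase, start, end, destinations };
        marks.push(mark);
        return mark;
      }

      // Example: link "SERIES OF DISTORTED IMAGES" only to the spotting tool.
      markSelection("SERIES OF DISTORTED IMAGES", 120, 146, ["spotting"]);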
  • In an embodiment, the script tool 102 allows a portion of the script that is already marked to be further marked to different destinations. For example, as shown in FIG. 3, the portion of the script that is designated with reference numeral 2 is already marked and is linked to the viewer tool. The script tool 102 allows the visual description SERIES OF DISTORTED IMAGES 14 to be further marked and linked only to the spotting tool 112, for example. Thus, upon selecting the SERIES OF DISTORTED IMAGES 14 phrase, the system will navigate the user to the spotting tool 112. However, the script tool 102 may be configured to provide the user the option to navigate to the viewer tool 110 to see the SERIES OF DISTORTED IMAGES 14 when clicking on the marked paragraph having the reference numeral 2.
  • In an embodiment, once the user selects the linked word(s), the user is requested to select which of the destination tools he or she would like to navigate to. In another embodiment, the system 100 navigates the user to a common page (not shown) which displays all of the destination tools where more information about the marked word(s) may be found. It should be noted that although linking of the script to video and/or audio is described herein, the system is configured to allow the user to link any two or more items together such that relevant or related information is kept together to aid in collaboration. Examples include linking characters to the scenes in which they appear or linking a PDF to an event on the calendar. Generally, linking is used to relate items together but can be used for other purposes, such as creating shortcuts instead of hyperlinking.
  • The system 100 includes the organizer tool 104 in accordance with an embodiment. As stated above, the organizer tool 104 allows the user to categorize content associated with the motion picture. In particular, the user utilizes the organizer tool 104 to designate content into bins, whereby the bins may be organized based on the script and/or a general outline upon which the motion picture is to be based. In particular, the bins are organized by scene, slate, take, shot, etc. to correspond with the direction or story of the motion picture. The bins are configured to be linked to one or more stored media content files, whereby selecting a particular bin will provide video, audio and text associated with the designated scene, slate, take and/or shot for that bin. For instance, by selecting a bin associated with a particular scene, the user is able to view all or selected media files designated with that scene. As discussed below, the organizer tool 104 provides a reconfigurable and scalable tree of the entire motion picture which allows the production team to break down the motion picture into easily manageable categorized portions, whereby each categorized portion provides the production team members all the necessary information to effectively collaborate, plan and execute that portion.
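  • The scene/slate/take/shot hierarchy might be represented as a tree of bins, for example (field names are assumptions):

      // Illustrative sketch of the bin tree; not the disclosed data model.
      interface Bin {
        label: string;          // e.g. "Scene 16", "Slate 52", "Take 52-4", "Shot 52-A1"
        description?: string;   // optional brief description shown on the tab
        mediaFiles: string[];   // identifiers of linked media content files
        children: Bin[];        // slates under a scene, takes under a slate, shots under a take
      }

      // Collect every media file reachable from a bin, so selecting a scene bin
      // can surface content for all of its slates, takes and shots at once.
      function collectMedia(bin: Bin): string[] {
        return bin.mediaFiles.concat(...bin.children.map(collectMedia));
      }

      const scene16: Bin = {
        label: "Scene 16",
        mediaFiles: [],
        children: [{
          label: "Slate 52",
          mediaFiles: [],
          children: [{ label: "Take 52-4", mediaFiles: ["take-52-4.mov"], children: [] }],
        }],
      };
      console.log(collectMedia(scene16)); // ["take-52-4.mov"]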
  • FIG. 4 illustrates a sample screen shot of the organizer tool 104 in accordance with an embodiment. As shown in FIG. 4, the organizer tool 104 displays several scene tabs, each of which is associated with a particular bin. The scene tabs are shown in FIG. 4 ranging from Scene 13 to Scene 17. The scene tabs may include a brief description of the scene, which may be entered manually into the system or may be imported from the script tool 102. Alternatively, the scene tabs do not contain any description therein. In an embodiment, each scene bin, once selected, displays one or more slates associated with the scene bin.
  • As shown in FIG. 4, the Scene 16 bin has been selected, whereby Scene 16 includes several slate tabs having Slates 49-52. Each slate tab is associated with a corresponding slate bin and may include a brief description of the slate, as shown in FIG. 4. This description in each slate tab may be manually entered into the system or may be imported from the script tool 102. Alternatively, the slate tabs do not contain any description therein. Each slate bin, once selected, displays one or more takes associated with that slate bin.
  • As shown in FIG. 4, Slate bin 52 is shown to be selected, whereby Slate bin 52 includes several Takes bin tabs having Takes bins 52-1, 52-4 and 52-7. Each Take bin tab is associated with a corresponding Take bin and may include a brief description of the Take, as shown in FIG. 4. Alternatively, the Take tabs do not contain any description therein. Each Take bin tab, once selected, displays one or more shots associated with the take.
  • As shown in FIG. 4, Take bin 52-4 is selected, whereby Take bin 52-4 includes Shot Tabs 52-A1 and 52-B1. Each Shot Tab is associated with a corresponding Shot Bin and may include a brief description of the shot, as shown in FIG. 4. This description in each Shot tab may be manually entered into the system or may be imported from the script tool 102. Alternatively, the Shot tabs do not contain any description therein. Each Shot tab, once selected, displays all pertinent data that is associated with the shot. Such data may be carried over from information originally entered in the scene tab (Scene 16), although not necessarily. Such information may include, but is not limited to, video and/or audio clips of the shot, contact information of the actors in the shot, notes, sound and/or visual effects, production budgets, still shots of the scene, location information and other types of information which would be beneficial for the production member. This information may be manually entered or may be imported into the system. In an embodiment, a hyperlink may be included to navigate the user to the script tool 102 to view the portion of the script which refers to the particular scene/shot.
  • The content bins are selectively organizable in an embodiment. In particular, the organizer tool 104 is configured to allow the user to move the bins and thereby reorganize the structure of the Scenes, Slates, Takes and Shots in any desired manner. For example, the Scene tabs can be moved to change the order in which a particular scene is located with respect to the other scenes. Thus, Scene 15 in FIG. 4 may be moved to be in between Scenes 13 and 14 by clicking Scene Tab 15 and dragging it between Scene Tabs 13 and 14. In an embodiment, the movement of the scene tab is noted by the system, whereby the system displays that the scene now between Scenes 13 and 14 (e.g. Scene 15) was originally and/or previously located elsewhere. In an embodiment, the system is accordingly updated to reflect the change by making a note on the organizer tool 104, version tool 114 and/or by actually moving the text of the script associated with Scene 15 to be between the text associated with Scenes 13 and 14. In an embodiment, the system automatically updates the numbering so that Scene Tab 15 is renumbered to become Scene Tab 14 when moved to the position after Scene Tab 13.
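  • One possible sketch of the reordering and renumbering behavior, assuming the scene tabs are held in an ordered list, is:

      // Illustrative sketch; structure and names are assumptions.
      interface SceneTab { label: string; previousIndex?: number; }

      // Move a scene tab to a new position, remember where it came from,
      // and renumber the labels to reflect the new order.
      function moveScene(scenes: SceneTab[], from: number, to: number): void {
        const [moved] = scenes.splice(from, 1);
        moved.previousIndex = from;                // retained so the UI can show the prior position
        scenes.splice(to, 0, moved);
        scenes.forEach((s, i) => (s.label = `Scene ${13 + i}`)); // renumber starting at Scene 13
      }

      const scenes: SceneTab[] = [13, 14, 15, 16, 17].map((n) => ({ label: `Scene ${n}` }));
      moveScene(scenes, 2, 1);                     // drag Scene 15 between Scenes 13 and 14
      console.log(scenes.map((s) => s.label));     // the former Scene 15 is now labeled Scene 14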
  • In an embodiment, the user may simply click on a Scene bin in the organizer tool 104 to access all information (e.g. content for all slates, shots, takes) regarding that particular scene at once. In another embodiment, as described above, the user may select a Scene bin and be directed stepwise through progressively narrower views of the information in the Scene bin (e.g. view only the available slates in the scene bin). For example, as shown in FIG. 5, upon clicking on Scene Tab 16, the user is able to view a vast array of information that is associated with that particular scene. For example, the user is able to view all the Slates, Takes and Shots associated with Scene 16 using the viewer tool 110 as shown in FIG. 5. The organizer tool 104 shown in FIG. 5 incorporates the viewer tool 110, wherein the user can view video clips, audio clips, graphical and/or textual information assigned to that particular scene. The organizer tool 104 thereby allows the user to easily view, comment on and/or select the takes or shots for each scene by merely selecting the particular Scene tab.
  • In an embodiment, as shown in FIG. 5, the viewer tool 110 is associated with the organizer tool 104 and allows one or more members of the production team to view and collaborate on content in a selected bin. In an embodiment, the viewer tool 110 includes a primary workspace section 502, a Notes Section 504, a secondary workspace section 506, and a scene selector section 508. The video in the primary workspace section 502 is preferably associated with the selected scene (Scene 16 in FIG. 5), although other video or audio content is contemplated. The user is alternatively able to view a still shot, sketch, personnel contact information, the script, notes, or other appropriate information in the area. It should be noted that although the layout of the viewer tool has a particular configuration in FIG. 5, it may have another configuration or layout and is thus not limited thereto.
  • The secondary workspace 506 shown in FIG. 5 is capable of displaying any or all of the material that is displayable in the primary workspace 502. The secondary workspace may be smaller in size than the primary workspace 502, although the user can dynamically customize the secondary workspace 506 to be equal to or greater in size than the primary workspace 502. In an embodiment, the secondary workspace 506 is used to provide additional information to the user while the user is viewing a movie clip or other data.
  • In an embodiment, the user is able to view video files in separate video players in the primary and secondary workspaces 502, 506. For example, the video clips of different takes may be compared in the primary and secondary workspaces. For example, as shown in FIG. 5, a particular take may show the subject from one camera angle in the primary workspace 502, whereas a different camera angle from the same take is displayed in the secondary workspace 506. In an embodiment, the timing of the clips in the primary and secondary workspaces is synchronized so that the user is able to view the differences between the two synchronized clips. In an embodiment, a take from one scene may be played in the video player in the primary workspace 502 while another take from the same scene is played within the video player in the secondary workspace 506, or vice versa. For instance, the primary workspace can display a scene where the actor waves with his right hand whereas the user plays a file in the secondary workspace of a different take in which the subject is waving with his left hand. This allows the user to compare the different takes to determine which to use in the cut. In an embodiment, the secondary workspace 506 may also display a list of files associated with the particular scene, whereby the files are listed in a drop-down menu indicated by reference numeral 510. The user is able to select a file in the list to view it on the primary workspace 502 and/or secondary workspace 506. In an embodiment, the secondary workspace 506 may display the script in the script tool 102 while audio and/or video is played back in the primary workspace 502. The system may highlight portions of the script as the video and/or audio is played in the primary workspace to allow the user to compare the script with what was actually shot in the take. This would require the video and/or audio to be synchronized with the script via a timestamp or other synchronizing method.
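  • Synchronized comparison of two takes might be sketched as follows, assuming a simple player abstraction rather than an actual browser or library API:

      // Illustrative sketch; the Player interface is an assumption.
      interface Player {
        currentTime: number;                 // seconds from the start of the clip
        seek(t: number): void;
        play(): void;
      }

      // Seek the secondary player to the primary player's position, then start both,
      // so two takes of the same scene can be compared side by side.
      function playInSync(primary: Player, secondary: Player): void {
        secondary.seek(primary.currentTime);
        primary.play();
        secondary.play();
      }

      // Trivial in-memory stand-in for a real video player, for the example below.
      class StubPlayer implements Player {
        currentTime = 0;
        seek(t: number): void { this.currentTime = t; }
        play(): void { /* start playback */ }
      }

      const primary = new StubPlayer();
      primary.currentTime = 12.5;
      playInSync(primary, new StubPlayer());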
  • In the Notes section 504, users can post messages to be shared with all the users who have access to the organizer tool 104 or to one or more particular users who have access to the organizer tool 104. The notes posted may be directed to suggestions or feedback regarding the scene or any other related matter, although not necessarily. As shown in FIG. 5, the Notes section states that the particular moment in the scene should be moved to timestamp 0028+00. The notes may be stored on each user's account, server, or a local computer, whereby the user can remove or highlight one or more notes without affecting the notes on another user's Notes Section. In an embodiment, the notes are uploaded and displayed periodically, although the notes may be updated in real time.
  • In an embodiment, the user can choose the preferred as well as alternative slates, takes and shots for each scene in a sneak preview mode. In the sneak preview mode, the system 100 may then play back the entire motion picture from beginning to end by playing the selected digital audio and video files of the preferred slates, takes, and shots in the order of the scenes (i.e. Scene 1, then Scene 2, then Scene 3, etc.). The system 100 may have a feature to play back all or a portion of the motion picture from beginning to end by playing the selected digital audio and video files with selected alternative slates, takes or shots. This allows the user to compare two different versions of the motion picture as a whole. In an embodiment, the system may allow the user to switch between the preferred and an alternate shot (or vice versa) at any time while the system 100 plays back in the sneak preview mode. In an embodiment, the system may allow the viewer to view two or more versions of the motion picture simultaneously to allow the viewer to compare the versions.
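  • Assembling a sneak-preview playback order from the preferred (or alternate) clips of each scene might be sketched as:

      // Illustrative sketch; the structures are assumptions.
      interface Scene {
        number: number;
        preferredClip: string;     // media file chosen as the preferred take/shot
        alternateClip?: string;    // optional alternate for comparison
      }

      // Return clip identifiers in scene order, optionally substituting alternates,
      // so one full version of the motion picture can be played back end to end.
      function sneakPreview(scenes: Scene[], useAlternates = false): string[] {
        return scenes
          .slice()
          .sort((a, b) => a.number - b.number)
          .map((s) => (useAlternates && s.alternateClip ? s.alternateClip : s.preferredClip));
      }

      const cut = sneakPreview([
        { number: 2, preferredClip: "s2-take4.mov", alternateClip: "s2-take7.mov" },
        { number: 1, preferredClip: "s1-take1.mov" },
      ]);
      console.log(cut); // ["s1-take1.mov", "s2-take4.mov"]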
  • In an embodiment, digital media is uploaded to the system directly from the movie studio. In an embodiment, digital media is uploaded from members of the production staff and outside sources. The system is also capable of downloading the digital media to diskette, compact disk, flash drives, servers and/or portable or non-portable media playback devices. The digital media files may be stored in a memory on the server 204 or in a separate memory 206, whereby the files in the memory are able to be easily retrieved from the client terminal. In another embodiment, the digital media files are stored on the client application, whereby the system will upload the file automatically or at a designated time. In an embodiment, peer-to-peer file sharing is performed between terminals for a particular project, whereby the system tracks the source and destination computers which are sharing the digital file. The digital video files are uploaded in any appropriate format (e.g. avi, mpeg, H.264, etc.). Audio files are uploaded or streamed in an appropriate format as well (e.g. mp3, mp4, wav, wma, asx, AAC, etc.).
  • Upon uploading the files to the system 100, the system 100 prompts the user as to where the uploaded file is accessible on the system 100. In particular, the system 100 will request whether the file being uploaded is to be a video clip that will be in the motion picture, and if so, which scene, slate, take and shot the clip is to be located in. Upon the user designating the destination of the file, the system 100 stores the file and places a link to the file at the proper location on the system browser.
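  • A simplified sketch of this upload flow, assuming the destination is captured as a scene/slate/take/shot tuple, might be:

      // Illustrative sketch; field names and storage are assumptions.
      interface Destination { scene: number; slate?: number; take?: string; shot?: string; }
      interface StoredFile { id: string; destination: Destination; }

      const files: StoredFile[] = [];

      // Store an uploaded clip together with the bin it should be linked from, so the
      // browser can place a link at the proper scene/slate/take/shot location.
      function acceptUpload(id: string, destination: Destination): StoredFile {
        const stored: StoredFile = { id, destination };
        files.push(stored);
        return stored;
      }

      acceptUpload("s16-52-4-a1.mov", { scene: 16, slate: 52, take: "52-4", shot: "52-A1" });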
  • FIG. 6 illustrates a screen shot of another type of layout of the viewer tool 110 in accordance with an embodiment. The viewer tool 110 allows for posting of content for approval and revision requests, whereby members of the production team can view and approve or request revisions quickly and conveniently. The viewer tool 110 may display still shots, video, concept art, storyboards, animatics, motion capture files, models, animation files, audio, and/or other content. The viewer tool 110 is configured to handle the format of the digital content for proper playback.
  • For video, the viewer tool provides the time code (TC) of the content during playback in an embodiment. The time coded material may be presented by the viewer tool 110 in SMPTE format or FF format. In an embodiment, the viewer tool 110 allows the user to place a marker on the playback of the video (or audio), whereby the marker corresponds to the time code of the content. The marker allows members of the production team to comment on an exact point of content and communicate exactly where in the playback the comment applies. A marker icon (or the actual time stamp) is displayed next to the note in an embodiment. A person reading the note can then select the marker, whereby the viewer tool 110 will play back the content from that exact time stamp.
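  • Converting a frame count into an SMPTE-style HH:MM:SS:FF time code for such markers might be sketched as follows (non-drop-frame, for simplicity):

      // Illustrative helper; real SMPTE handling (e.g. drop-frame at 29.97 fps) is more involved.
      function toSmpte(frame: number, fps = 24): string {
        const ff = frame % fps;
        const totalSeconds = Math.floor(frame / fps);
        const ss = totalSeconds % 60;
        const mm = Math.floor(totalSeconds / 60) % 60;
        const hh = Math.floor(totalSeconds / 3600);
        const pad = (n: number) => String(n).padStart(2, "0");
        return `${pad(hh)}:${pad(mm)}:${pad(ss)}:${pad(ff)}`;
      }

      console.log(toSmpte(1234)); // "00:00:51:10" at 24 fps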
  • As shown in FIG. 6, the bottom portion of the viewer tool shows a screen that a user might use to place a note to another person. As stated, the note may be posted as in a bulletin board or may be sent as a message. The viewer tool 110 may be a stand alone tool, as shown in FIG. 6.
  • As discussed above, the system 100 includes a spotting tool 112 in accordance with an embodiment. The spotting tool 112 allows dialogue, voice-overs, sound effects and music to be added to the chosen scenes, takes and shots. Dialogue and music may be added at any time during the production of the motion picture, although dialogue, voice-overs, and music are usually applied in post-production. The audio files are uploaded onto the system 100 by one or more members of the production team, although any other persons may upload the data as well. The uploaded audio files can be linked to the marked time stamp discussed above, whereby playback of the video will automatically play back the audio from the spotting tool 112 in synchronization. The spotting tool 112 allows one or more users to make notes, and thus comment, on the audio files. As with the video, the audio files may be time stamped as well, whereby the time stamp corresponds with the time stamp of the video and/or has a separate time stamp of its own. Therefore, a person can separately mark (and comment on) the audio file in the context of the video as well as a portion of the audio file itself.
  • Referring back to FIG. 1, the system 100 includes a version tool 114 in an embodiment. The version tool displays the versions of the project, the tasks, scene set-ups, the script, and digital and audio files, as well as who requested the revision, when the revision was requested, and what the status of the revision is. It is also possible for a user to enter notes about the revision for review by others. These notes can be used for follow-up by the production team members.
  • FIG. 7 illustrates a block diagram of the production tracking tool 106 in accordance with an embodiment. As shown in FIG. 7, the production tracking tool 106 enables the management of one or more pending projects by providing the user with the necessary information to track the progress of the project and evaluate yet to-be-completed items. The production tracking tool 106 may be configured to present a one page summary of current, past and future production status information. In an embodiment, the production tracking tool 106 includes a calendar tool 302, a task assignment tool 304, a personnel/contacts tool 306, a note tool 308, and a location tool 310. It should be noted that the production tracking tool 106 may include additional and/or other components which would be useful in the management of a motion picture project. This information, defined herein as profile information, can be linked to the scenes, takes, shots and/or slates, such that the profile information is automatically displayed to the user when reviewing that particular scene, take, shot and/or slate.
  • The system 100 stores the project status information in a database which may be accessed, viewed and/or modified (by authorized users). The production tracking tool 106 allows viewing of the current status of the project, or project history. Every item involved in the project (“production item”) in the database may be accessed by search regarding its current status. As each production item is created by members of the production team, it is stored in the system's memory. The items added to the system may be required to be approved by one or more other members of the production team before they are stored.
  • The calendar 302 in FIG. 7 allows the creation and maintenance of the production schedule of the project. In an embodiment, certain projects in the production of a motion picture may be broken down and categorized further into sub-projects. These sub-projects may be organized and tracked using the production tracking tool 106. The calendar may provide information in timeline, calendar or other views for past, ongoing and future tasks. The information in the calendar is able to be updated in real time to allow members of the production team to view the status of the project. The calendar tool 302 may be configured to automatically notify one or more persons regarding a schedule change. In an embodiment, the calendar tool 302 can display additional information that is not event-based, such as files, contact information, notes, or other profile information as discussed herein. Some examples include placing characters, locations, and script elements on days or weeks for easy access. The calendar tool 302 thus allows items to be placed directly on the calendar versus simply displaying events with items linked to them.
  • The production tracking tool 106 includes a task tool 304, as shown in FIG. 7. The task tool allows one to view past, current and future assignments. The task tool 304 may be configured to assign a particular task to one or more persons, whereby the persons are automatically notified of their assignments. The task tool 304 allows pertinent information to be input including, but not limited to, amount of time to complete the task, desired due date, budgeted cost of performing the task (in terms of cost of labor and materials), etc. Other tasks may be arranged to be dependent on the particular task, whereby the assigned person cannot receive the necessary information to complete any future tasks until the currently assigned task is completed. The task tool 304 may include a feature which calculates the amount of time needed to complete several tasks for the assigned person to create an acceptable workload for the person. The task tool 304 may be configured to interface with the calendar 302 to identify any scheduling conflicts. The task tool 304 may be configured to interface with the budget tool 116 (FIG. 1) to provide cost information for each task. In an embodiment, the task tool 304 can be configured to include one or more “playlists,” each of which is a group or series of items in a particular order. The system allows one or more playlists to be sent from one user to another, irrespective of security controls placed on the items in the playlists, whereby the playlist includes a number of tasks and profile information (e.g. images, sounds, video clips, zip archives, pdf documents) in a particular order, such that the receiving person can execute the tasks in the given order to make completing the tasks easier and more efficient.
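  • The dependency and workload behavior described above might be sketched as follows, with field names assumed for illustration:

      // Illustrative sketch; not the disclosed implementation.
      interface Task {
        id: string;
        hours: number;           // estimated time to complete
        done: boolean;
        dependsOn: string[];     // ids of tasks that must finish first
      }

      // A task is ready to start only when every prerequisite task is marked done.
      function isReady(task: Task, all: Map<string, Task>): boolean {
        return task.dependsOn.every((id) => all.get(id)?.done === true);
      }

      // Total estimated hours of a person's open tasks, per the workload feature.
      function workload(tasks: Task[]): number {
        return tasks.filter((t) => !t.done).reduce((sum, t) => sum + t.hours, 0);
      }

      const all = new Map<string, Task>([
        ["shoot", { id: "shoot", hours: 20, done: true, dependsOn: [] }],
        ["edit", { id: "edit", hours: 8, done: false, dependsOn: ["shoot"] }],
      ]);
      console.log(isReady(all.get("edit")!, all)); // true: shooting is finished
      console.log(workload([...all.values()]));    // 8 hours still open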
  • The personnel/contacts tool 306 shown in FIG. 7 allows the members of the production team to view contact information of all persons who are or may be involved in the project, although not necessarily. Such persons include actors, directors, producers, production engineers, staff, vendors, etc. Specific information as well as multimedia data may be included for each contact in the contact list. For example, digital clips of prior movies which have been done by a particular actor may be included in the actor's profile in the contacts tool 306. This particular feature may allow members of the production team to preview the type of role that the particular actor may be best served in (e.g. supporting actors, stunt doubles, etc.).
  • The note tool 308 shown in FIG. 7 allows for communication between persons on the production team regarding the project. The note tool 308 facilitates collaboration regarding the project by allowing the users to effectively communicate to one another regarding particular scenes, slates, takes, as well as shots (or aspects thereof) so that necessary adjustments or changes may be made efficiently and quickly, as discussed more below. The note tool 308 may take the form of postings (such as on a bulletin board), although the note tool 308 may include features like instant messaging, email, fax, Internet calls, etc. In an embodiment, the note tool 308 includes verifiable transmission of the note and reception by the intended recipient of the note. The notes are editable and are able to be deleted by authorized users in an embodiment. Although the note tool 308 may be a separate component from the other tools in the system, the note tool 308 may be integrated within one or more other tools (e.g. viewer tool, script tool, etc.).
  • The production tracking tool 106 may also include a location tool 310 as shown in FIG. 7. The location tool 310 may provide information for all the desired locations where the motion picture may be shot, whereby the locations may be indoor or outdoor areas. The location tool 310 may be linked up with the script tool 102 to allow the user to set and/or view the location where the portion of the script should be shot. The location tool 310 may be linked to the personnel/contact tool 306 and tasks tool 304 to assign and organize the persons who will be present at the particular location. The location tool 310 may be linked to the note tool 308 to provide notes and/or additional information of the locations. The location tool 310 may have a field to allow the user to insert the cost of using the location for a desired amount of time, whereby the location tool 310 may be linked to the budget tool 116 and may update the budget accordingly.
  • The budget tool 116 referred to above allows authorized persons in the production team to manage and edit budgeting of the particular project or sub-projects. The budget tool 116 may provide accounting as well as analysis of the budget for the project or sub-projects, as well as track and analyze invoice details. The budget tool 116 may include any other features which are found in accounting software programs.
  • FIG. 8 illustrates a block diagram of the administrator tool of the system in accordance with an embodiment. The administrator tool 108 provides all the administrative functions which are used to create and maintain user accounts and logins (block 402), set up and manage security (block 404) and access (block 406), generate reports (block 408) and manage files (block 410). The administrator tool 108 is run by the administrator who has sole internal rights and access to the server 204 (FIG. 2). The administrator tool 108 allows the administrator to modify, add, and delete user groups. Within each user group, the administrator may also add or remove individual users. The administrator tool 108 allows the creation and modification of levels of access of each user.
  • The administrator tool 108 also provides access to reporting features of the system 100, whereby the reporting feature allows convenient access for management staff to track the progress, efficiencies, and status of the system, as necessary. The reporting feature may include, but is not limited to, daily operation reports, custom system reports, workload reports, employee reports, accounting reports, past-due task reports, cost/budget reports, and production reports.
  • The administrator tool 108 allows for file management rights in the system. The administrator tool 108 allows the administrator to set and grant uploading and downloading rights to appropriate individuals as well as monitor (and limit) the amount of uploading and downloading permitted by the system 100. The administrator tool 108 tracks movement of all files in the system 100 as well as corrects any problems associated with file management.
  • While embodiments and applications of the system have been shown and described, it would be apparent to those skilled in the art having the benefit of this disclosure that many more modifications than mentioned above are possible without departing from the inventive concepts herein.

Claims (10)

1. A method to provide a user-interface for an organizer tool implemented on a system to organize a script comprising:
at a server:
transmitting a first set of data for displaying on a computer terminal one or more content bins;
receiving from said computer terminal a request to select one of said one or more content bins; and
in response to said request from said computer terminal to select said one of said one or more content bins, transmitting a second set of data for displaying on said computer terminal one or more scene tabs associated with said one of said one or more content bins.
2. The method of claim 1 further comprising:
receiving from said computer terminal a request to select one of said one or more scene tabs; and
in response to said request from said computer terminal to select one of said one or more scene tabs, transmitting a third set of data for displaying on said computer terminal one or more slates associated with said selected one of said one or more scene tabs.
3. The method of claim 2 further comprising:
receiving a request from said computer terminal to change a location of said one of said one or more scene tabs from a first displayed location; and
in response to said request from said computer terminal to change said location of said one of said one or more scene tabs, updating said location to a second displayed location.
4. The method of claim 1, wherein said computer terminal is a mobile device.
5. The method of claim 4, wherein said mobile device is a mobile phone.
6. The method of claim 3 further comprising:
transmitting a fourth set of data for displaying on said computer terminal an indication that said one or more scene tabs was previously located at said first displayed location.
7. The method of claim 6, wherein said indication is a note.
8. The method of claim 1, wherein said one or more bins are configured to link to one or more stored media content files.
9. The method of claim 3, wherein transmitting said first set of data for displaying on said computer terminal said one or more content bins includes transmitting data for displaying said one or more content bins in a tree configuration.
10. The method of claim 9, wherein updating said location to said second displayed location includes updating data that defines how said tree configuration is displayed on said computer terminal.
Citations (63)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5191645A (en) * 1991-02-28 1993-03-02 Sony Corporation Of America Digital signal processing system employing icon displays
US5412773A (en) * 1991-11-19 1995-05-02 Sony Electronics Inc. Computerized interactive menu-driven video signal processing apparatus and method
US5659793A (en) * 1994-12-22 1997-08-19 Bell Atlantic Video Services, Inc. Authoring tools for multimedia application development and network delivery
US5675748A (en) * 1993-12-21 1997-10-07 Object Technology Licensing Corp. Method and apparatus for automatically configuring computer system hardware and software
US5801707A (en) * 1996-07-19 1998-09-01 Motorola, Inc. Method and apparatus for displaying hierarchical data associated with components of a system
US5821945A (en) * 1995-02-03 1998-10-13 The Trustees Of Princeton University Method and apparatus for video browsing based on content and structure
US5852435A (en) * 1996-04-12 1998-12-22 Avid Technology, Inc. Digital multimedia editing and data management system
US5892535A (en) * 1996-05-08 1999-04-06 Digital Video Systems, Inc. Flexible, configurable, hierarchical system for distributing programming
US5905841A (en) * 1992-07-01 1999-05-18 Avid Technology, Inc. Electronic film editing system using both film and videotape format
US5907704A (en) * 1995-04-03 1999-05-25 Quark, Inc. Hierarchical encapsulation of instantiated objects in a multimedia authoring system including internet accessible objects
US6144962A (en) * 1996-10-15 2000-11-07 Mercury Interactive Corporation Visualization of web sites and hierarchical data structures
US6154601A (en) * 1996-04-12 2000-11-28 Hitachi Denshi Kabushiki Kaisha Method for editing image information with aid of computer and editing system
US6271845B1 (en) * 1998-05-29 2001-08-07 Hewlett Packard Company Method and structure for dynamically drilling down through a health monitoring map to determine the health status and cause of health problems associated with network objects of a managed network environment
US6278464B1 (en) * 1997-03-07 2001-08-21 Silicon Graphics, Inc. Method, system, and computer program product for visualizing a decision-tree classifier
US20010036356A1 (en) * 2000-04-07 2001-11-01 Autodesk, Inc. Non-linear video editing system
US20020022986A1 (en) * 1998-11-30 2002-02-21 Coker John L. Smart scripting call centers
US20020186485A1 (en) * 2001-05-12 2002-12-12 Lg Electronics Inc. Recording medium containing moving picture data and additional information thereof and reproducing method and apparatus of the recording medium
US20030078973A1 (en) * 2001-09-25 2003-04-24 Przekop Michael V. Web-enabled system and method for on-demand distribution of transcript-synchronized video/audio records of legal proceedings to collaborative workgroups
US20040001079A1 (en) * 2002-07-01 2004-01-01 Bin Zhao Video editing GUI with layer view
US20040059996A1 (en) * 2002-09-24 2004-03-25 Fasciano Peter J. Exhibition of digital media assets from a digital media asset management system to facilitate creative story generation
US20040143598A1 (en) * 2003-01-21 2004-07-22 Drucker Steven M. Media frame object visualization system
US20040143604A1 (en) * 2003-01-21 2004-07-22 Steve Glenner Random access editing of media
US20040143590A1 (en) * 2003-01-21 2004-07-22 Wong Curtis G. Selection bins
US20040172593A1 (en) * 2003-01-21 2004-09-02 Curtis G. Wong Rapid media group annotation
US20040220926A1 (en) * 2000-01-03 2004-11-04 Interactual Technologies, Inc., A California Corporation Personalization services for entities from multiple sources
US20040220791A1 (en) * 2000-01-03 2004-11-04 Interactual Technologies, Inc., A California Corporation Personalization services for entities from multiple sources
US20040230572A1 (en) * 2001-06-22 2004-11-18 Nosa Omoigui System and method for semantic knowledge retrieval, management, capture, sharing, discovery, delivery and presentation
US20040236616A1 (en) * 1999-11-01 2004-11-25 Ita Software, Inc., A Massachusetts Corporation Graphical user interface for travel planning system
US20040268413A1 (en) * 2003-05-29 2004-12-30 Reid Duane M. System for presentation of multimedia content
US20050005246A1 (en) * 2000-12-21 2005-01-06 Xerox Corporation Navigation methods, systems, and computer program products for virtual three-dimensional books
US20050022254A1 (en) * 2003-07-01 2005-01-27 Dirk Adolph Method and apparatus for editing a data stream
US20050033747A1 (en) * 2003-05-25 2005-02-10 Erland Wittkotter Apparatus and method for the server-sided linking of information
US20050163462A1 (en) * 2004-01-28 2005-07-28 Pratt Buell A. Motion picture asset archive having reduced physical volume and method
US20050171964A1 (en) * 1999-05-21 2005-08-04 Kulas Charles J. Creation and playback of computer-generated productions using script-controlled rendering engines
US20060026655A1 (en) * 2004-07-30 2006-02-02 Perez Milton D System and method for managing, converting and displaying video content on a video-on-demand platform, including ads used for drill-down navigation and consumer-generated classified ads
US6995768B2 (en) * 2000-05-10 2006-02-07 Cognos Incorporated Interactive business data visualization system
US7102644B2 (en) * 1995-12-11 2006-09-05 Apple Computer, Inc. Apparatus and method for storing a movie within a movie
US7124366B2 (en) * 1996-07-29 2006-10-17 Avid Technology, Inc. Graphical user interface for a motion video planning and editing system for a computer
US7143100B2 (en) * 2001-06-13 2006-11-28 Mci, Llc Method, system and program product for viewing and manipulating graphical objects representing hierarchically arranged elements of a modeled environment
US20060277454A1 (en) * 2003-12-09 2006-12-07 Yi-Chih Chen Multimedia presentation system
US7197491B1 (en) * 1999-09-21 2007-03-27 International Business Machines Corporation Architecture and implementation of a dynamic RMI server configuration hierarchy to support federated search and update across heterogeneous datastores
US7200320B1 (en) * 2001-11-13 2007-04-03 Denecke, Inc. Time code slate system and method
US20070204211A1 (en) * 2006-02-24 2007-08-30 Paxson Dana W Apparatus and method for creating literary macrames
US20070239787A1 (en) * 2006-04-10 2007-10-11 Yahoo! Inc. Video generation based on aggregate user data
US20070260690A1 (en) * 2004-09-27 2007-11-08 David Coleman Method and Apparatus for Remote Voice-Over or Music Production and Management
US20080010585A1 (en) * 2003-09-26 2008-01-10 Fuji Xerox Co., Ltd. Binding interactive multichannel digital document system and authoring tool
US20080010601A1 (en) * 2006-06-22 2008-01-10 Dachs Eric B System and method for web based collaboration using digital media
US20080037879A1 (en) * 2006-07-25 2008-02-14 Paxson Dana W Method and apparatus for electronic literary macrame component referencing
US7334197B2 (en) * 2000-07-19 2008-02-19 Microsoft Corporation Display and management of data within hierarchies and polyarchies of information
US20080195608A1 (en) * 2004-12-30 2008-08-14 Lina Clover Computer-Implemented System And Method For Visualizing OLAP And Multidimensional Data In A Calendar Format
US20080195932A1 (en) * 2002-05-24 2008-08-14 Kazushige Oikawa Method and apparatus for re-editing and redistributing web documents
US20080222504A1 (en) * 2007-02-26 2008-09-11 Nokia Corporation Script-based system to perform dynamic updates to rich media content and services
US20090024963A1 (en) * 2007-07-19 2009-01-22 Apple Inc. Script-integrated storyboards
US20090094039A1 (en) * 2007-10-04 2009-04-09 Zhura Corporation Collaborative production of rich media content
US7660416B1 (en) * 2005-01-11 2010-02-09 Sample Digital Holdings Llc System and method for media content collaboration throughout a media production process
US20100083077A1 (en) * 2004-02-06 2010-04-01 Sequoia Media Group, Lc Automated multimedia object models
US20100322589A1 (en) * 2007-06-29 2010-12-23 Russell Henderson Non sequential automated production by self-interview kit of a video based on user generated multimedia content
US7899915B2 (en) * 2002-05-10 2011-03-01 Richard Reisman Method and apparatus for browsing using multiple coordinated device sets
US8037108B1 (en) * 2009-07-22 2011-10-11 Adobe Systems Incorporated Conversion of relational databases into triplestores
US20120198412A1 (en) * 2005-04-19 2012-08-02 Oliver Creighton Software cinema
US8266283B2 (en) * 2004-03-18 2012-09-11 Andrew Liebman Media file access and storage solution for multi-workstation/multi-platform non-linear video editing systems
US8370869B2 (en) * 1998-11-06 2013-02-05 The Trustees Of Columbia University In The City Of New York Video description system and method
US20130254158A1 (en) * 2011-09-14 2013-09-26 O2 Filmes E Videos Ltda Literary object organization

Family Cites Families (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5671428A (en) * 1991-08-28 1997-09-23 Kabushiki Kaisha Toshiba Collaborative document processing system with version and comment management
US6948070B1 (en) 1995-02-13 2005-09-20 Intertrust Technologies Corporation Systems and methods for secure transaction management and electronic rights protection
US6786420B1 (en) 1997-07-15 2004-09-07 Silverbrook Research Pty. Ltd. Data distribution mechanism in the form of ink dots on cards
US6553404B2 (en) 1997-08-08 2003-04-22 Prn Corporation Digital system
EP1002274B1 (en) 1997-08-08 2014-04-02 Thomson Licensing Digital department system
US6957186B1 (en) 1999-05-27 2005-10-18 Accenture Llp System method and article of manufacture for building, managing, and supporting various components of a system
US6598074B1 (en) * 1999-09-23 2003-07-22 Rocket Network, Inc. System and method for enabling multimedia production collaboration over a network
AU2753901A (en) 1999-12-31 2001-07-16 Graham L. Mahin System and method for automated script-based production
WO2001052526A2 (en) 2000-01-14 2001-07-19 Parkervision, Inc. System and method for real time video production
NL1015363C2 (en) 2000-02-29 2001-08-30 Richard Hendricus Johannes Van Method and system for making audio and/or video files available.
JP2001290938A (en) 2000-03-24 2001-10-19 Trw Inc Integrated digital production line for full-motion visual product
US7143357B1 (en) * 2000-05-18 2006-11-28 Vulcan Portals, Inc. System and methods for collaborative digital media development
JP2001351116A (en) 2000-06-07 2001-12-21 Sony Corp Electronic animation comic providing system, electronic information generating device, information processing apparatus, recording medium and electronic animation comic providing method
US20050068462A1 (en) 2000-08-10 2005-03-31 Harris Helen J. Process for associating and delivering data with visual media
US20060015904A1 (en) 2000-09-08 2006-01-19 Dwight Marcus Method and apparatus for creation, distribution, assembly and verification of media
AU2001288079A1 (en) 2000-11-02 2002-05-15 Fujiyama Co., Ltd. Distribution system of digital image content and reproducing method and medium recording its reproduction program
US6897880B2 (en) 2001-02-22 2005-05-24 Sony Corporation User interface for generating parameter values in media presentations based on selected presentation instances
TWI220036B (en) 2001-05-10 2004-08-01 Ibm System and method for enhancing broadcast or recorded radio or television programs with information on the world wide web
TWI256250B (en) 2001-05-10 2006-06-01 Ibm System and method for enhancing recorded radio or television programs with information on the world wide web
AU2002355530A1 (en) 2001-08-03 2003-02-24 John Allen Ananian Personalized interactive digital catalog profiling
US7603626B2 (en) * 2001-09-10 2009-10-13 Disney Enterprises, Inc. Method and system for creating a collaborative work over a digital network
US7432940B2 (en) 2001-10-12 2008-10-07 Canon Kabushiki Kaisha Interactive animation of sprites in a video production
US20030158872A1 (en) 2002-02-19 2003-08-21 Media Vu, Llc Method and system for checking content before dissemination
JP4218264B2 (en) 2002-06-25 2009-02-04 ソニー株式会社 Content creation system, content plan creation program, program recording medium, imaging device, imaging method, imaging program
US7885887B2 (en) 2002-07-09 2011-02-08 Artistshare, Inc. Methods and apparatuses for financing and marketing a creative work
US7613630B2 (en) 2002-10-17 2009-11-03 Automated Media Services, Inc. System and method for editing existing footage to generate and distribute advertising content to retail locations
US7039655B2 (en) 2003-04-07 2006-05-02 Mesoft Partners, Llc System and method for providing a digital media supply chain operation system and suite of applications
US20050010874A1 (en) * 2003-07-07 2005-01-13 Steven Moder Virtual collaborative editing room
US20050071736A1 (en) 2003-09-26 2005-03-31 Fuji Xerox Co., Ltd. Comprehensive and intuitive media collection and management tool
US20050084082A1 (en) * 2003-10-15 2005-04-21 Microsoft Corporation Designs, interfaces, and policies for systems that enhance communication and minimize disruption by encoding preferences and situations
US20050165840A1 (en) 2004-01-28 2005-07-28 Pratt Buell A. Method and apparatus for improved access to a compacted motion picture asset archive
US20050173864A1 (en) * 2004-02-10 2005-08-11 Yongjun Zhao Authorship cooperative system
US20050187806A1 (en) 2004-02-20 2005-08-25 Idt Corporation Global animation studio
US8019194B2 (en) 2004-04-05 2011-09-13 S. two Corp. Digital audio and video recording and storage system and method
CA2509092A1 (en) 2004-06-03 2005-12-03 Casting Workbook Services Inc. Method and system for creating, tracking, casting and reporting on moving image projects
US20060047547A1 (en) 2004-08-25 2006-03-02 Ekker Jon D System and method for automated product design and approval
US20060064644A1 (en) * 2004-09-20 2006-03-23 Joo Jin W Short-term filmmaking event administered over an electronic communication network
US20060064731A1 (en) 2004-09-20 2006-03-23 Mitch Kahle System and method for automated production of personalized videos on digital media of individual participants in large events
US20060218476A1 (en) * 2005-03-25 2006-09-28 Xerox Corporation Collaborative document authoring and production methods and systems
US8826136B2 (en) * 2005-06-27 2014-09-02 Core Wireless Licensing S.A.R.L. System and method for enabling collaborative media stream editing
WO2007035317A2 (en) * 2005-09-16 2007-03-29 Snapse, Inc. System and method for providing a media content exchange
US20070118801A1 (en) * 2005-11-23 2007-05-24 Vizzme, Inc. Generation and playback of multimedia presentations
US20090196570A1 (en) * 2006-01-05 2009-08-06 Eyesopt Corporation System and methods for online collaborative video creation
US20070162854A1 (en) * 2006-01-12 2007-07-12 Dan Kikinis System and Method for Interactive Creation of and Collaboration on Video Stories
US20070239839A1 (en) * 2006-04-06 2007-10-11 Buday Michael E Method for multimedia review synchronization
US10380231B2 (en) * 2006-05-24 2019-08-13 International Business Machines Corporation System and method for dynamic organization of information sets
US20070282614A1 (en) * 2006-05-29 2007-12-06 Christian Dreke A method for collaborative creation of artwork by a large plurality of users using a communications network

Patent Citations (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5191645A (en) * 1991-02-28 1993-03-02 Sony Corporation Of America Digital signal processing system employing icon displays
US5412773A (en) * 1991-11-19 1995-05-02 Sony Electronics Inc. Computerized interactive menu-driven video signal processing apparatus and method
US5905841A (en) * 1992-07-01 1999-05-18 Avid Technology, Inc. Electronic film editing system using both film and videotape format
US20040057696A1 (en) * 1992-07-01 2004-03-25 Peters Eric C. Electronic film editing system using both film and videotape format
US5675748A (en) * 1993-12-21 1997-10-07 Object Technology Licensing Corp. Method and apparatus for automatically configuring computer system hardware and software
US5659793A (en) * 1994-12-22 1997-08-19 Bell Atlantic Video Services, Inc. Authoring tools for multimedia application development and network delivery
US5821945A (en) * 1995-02-03 1998-10-13 The Trustees Of Princeton University Method and apparatus for video browsing based on content and structure
US5907704A (en) * 1995-04-03 1999-05-25 Quark, Inc. Hierarchical encapsulation of instantiated objects in a multimedia authoring system including internet accessible objects
US7102644B2 (en) * 1995-12-11 2006-09-05 Apple Computer, Inc. Apparatus and method for storing a movie within a movie
US5852435A (en) * 1996-04-12 1998-12-22 Avid Technology, Inc. Digital multimedia editing and data management system
US6154601A (en) * 1996-04-12 2000-11-28 Hitachi Denshi Kabushiki Kaisha Method for editing image information with aid of computer and editing system
US5892535A (en) * 1996-05-08 1999-04-06 Digital Video Systems, Inc. Flexible, configurable, hierarchical system for distributing programming
US5801707A (en) * 1996-07-19 1998-09-01 Motorola, Inc. Method and apparatus for displaying hierarchical data associated with components of a system
US7124366B2 (en) * 1996-07-29 2006-10-17 Avid Technology, Inc. Graphical user interface for a motion video planning and editing system for a computer
US6144962A (en) * 1996-10-15 2000-11-07 Mercury Interactive Corporation Visualization of web sites and hierarchical data structures
US6278464B1 (en) * 1997-03-07 2001-08-21 Silicon Graphics, Inc. Method, system, and computer program product for visualizing a decision-tree classifier
US20020054169A1 (en) * 1998-05-29 2002-05-09 Richardson David E. Method and apparatus for dynamically drilling-down through a health monitoring map to determine the health status and cause of health problems associated with network objects of a managed network environment
US6271845B1 (en) * 1998-05-29 2001-08-07 Hewlett Packard Company Method and structure for dynamically drilling down through a health monitoring map to determine the health status and cause of health problems associated with network objects of a managed network environment
US8370869B2 (en) * 1998-11-06 2013-02-05 The Trustees Of Columbia University In The City Of New York Video description system and method
US20020022986A1 (en) * 1998-11-30 2002-02-21 Coker John L. Smart scripting call centers
US6826745B2 (en) * 1998-11-30 2004-11-30 Siebel Systems, Inc. System and method for smart scripting call centers and configuration thereof
US20050171964A1 (en) * 1999-05-21 2005-08-04 Kulas Charles J. Creation and playback of computer-generated productions using script-controlled rendering engines
US6947044B1 (en) * 1999-05-21 2005-09-20 Kulas Charles J Creation and playback of computer-generated productions using script-controlled rendering engines
US7197491B1 (en) * 1999-09-21 2007-03-27 International Business Machines Corporation Architecture and implementation of a dynamic RMI server configuration hierarchy to support federated search and update across heterogeneous datastores
US20040236616A1 (en) * 1999-11-01 2004-11-25 Ita Software, Inc., A Massachusetts Corporation Graphical user interface for travel planning system
US20040220926A1 (en) * 2000-01-03 2004-11-04 Interactual Technologies, Inc., A California Corporation Personalization services for entities from multiple sources
US20040220791A1 (en) * 2000-01-03 2004-11-04 Interactual Technologies, Inc., A California Corporation Personalization services for entities from multiple sources
US20010036356A1 (en) * 2000-04-07 2001-11-01 Autodesk, Inc. Non-linear video editing system
US6995768B2 (en) * 2000-05-10 2006-02-07 Cognos Incorporated Interactive business data visualization system
US7334197B2 (en) * 2000-07-19 2008-02-19 Microsoft Corporation Display and management of data within hierarchies and polyarchies of information
US20050005246A1 (en) * 2000-12-21 2005-01-06 Xerox Corporation Navigation methods, systems, and computer program products for virtual three-dimensional books
US20020186485A1 (en) * 2001-05-12 2002-12-12 Lg Electronics Inc. Recording medium containing moving picture data and additional information thereof and reproducing method and apparatus of the recording medium
US7143100B2 (en) * 2001-06-13 2006-11-28 Mci, Llc Method, system and program product for viewing and manipulating graphical objects representing hierarchically arranged elements of a modeled environment
US20040230572A1 (en) * 2001-06-22 2004-11-18 Nosa Omoigui System and method for semantic knowledge retrieval, management, capture, sharing, discovery, delivery and presentation
US20030078973A1 (en) * 2001-09-25 2003-04-24 Przekop Michael V. Web-enabled system and method for on-demand distribution of transcript-synchronized video/audio records of legal proceedings to collaborative workgroups
US7200320B1 (en) * 2001-11-13 2007-04-03 Denecke, Inc. Time code slate system and method
US7899915B2 (en) * 2002-05-10 2011-03-01 Richard Reisman Method and apparatus for browsing using multiple coordinated device sets
US20080195932A1 (en) * 2002-05-24 2008-08-14 Kazushige Oikawa Method and apparatus for re-editing and redistributing web documents
US20040001079A1 (en) * 2002-07-01 2004-01-01 Bin Zhao Video editing GUI with layer view
US20040059996A1 (en) * 2002-09-24 2004-03-25 Fasciano Peter J. Exhibition of digital media assets from a digital media asset management system to facilitate creative story generation
US7509321B2 (en) * 2003-01-21 2009-03-24 Microsoft Corporation Selection bins for browsing, annotating, sorting, clustering, and filtering media objects
US20040172593A1 (en) * 2003-01-21 2004-09-02 Curtis G. Wong Rapid media group annotation
US7117453B2 (en) * 2003-01-21 2006-10-03 Microsoft Corporation Media frame object visualization system
US20040143590A1 (en) * 2003-01-21 2004-07-22 Wong Curtis G. Selection bins
US20040143604A1 (en) * 2003-01-21 2004-07-22 Steve Glenner Random access editing of media
US20040143598A1 (en) * 2003-01-21 2004-07-22 Drucker Steven M. Media frame object visualization system
US20050033747A1 (en) * 2003-05-25 2005-02-10 Erland Wittkotter Apparatus and method for the server-sided linking of information
US20040268413A1 (en) * 2003-05-29 2004-12-30 Reid Duane M. System for presentation of multimedia content
US7653284B2 (en) * 2003-07-01 2010-01-26 Thomson Licensing Method and apparatus for editing a data stream
US20050022254A1 (en) * 2003-07-01 2005-01-27 Dirk Adolph Method and apparatus for editing a data stream
US20080010585A1 (en) * 2003-09-26 2008-01-10 Fuji Xerox Co., Ltd. Binding interactive multichannel digital document system and authoring tool
US20060277454A1 (en) * 2003-12-09 2006-12-07 Yi-Chih Chen Multimedia presentation system
US20050163462A1 (en) * 2004-01-28 2005-07-28 Pratt Buell A. Motion picture asset archive having reduced physical volume and method
US20100083077A1 (en) * 2004-02-06 2010-04-01 Sequoia Media Group, Lc Automated multimedia object models
US8266283B2 (en) * 2004-03-18 2012-09-11 Andrew Liebman Media file access and storage solution for multi-workstation/multi-platform non-linear video editing systems
US20060026655A1 (en) * 2004-07-30 2006-02-02 Perez Milton D System and method for managing, converting and displaying video content on a video-on-demand platform, including ads used for drill-down navigation and consumer-generated classified ads
US20070260690A1 (en) * 2004-09-27 2007-11-08 David Coleman Method and Apparatus for Remote Voice-Over or Music Production and Management
US7592532B2 (en) * 2004-09-27 2009-09-22 Soundstreak, Inc. Method and apparatus for remote voice-over or music production and management
US20080195608A1 (en) * 2004-12-30 2008-08-14 Lina Clover Computer-Implemented System And Method For Visualizing OLAP And Multidimensional Data In A Calendar Format
US7660416B1 (en) * 2005-01-11 2010-02-09 Sample Digital Holdings Llc System and method for media content collaboration throughout a media production process
US20120198412A1 (en) * 2005-04-19 2012-08-02 Oliver Creighton Software cinema
US7810021B2 (en) * 2006-02-24 2010-10-05 Paxson Dana W Apparatus and method for creating literary macramés
US20070204211A1 (en) * 2006-02-24 2007-08-30 Paxson Dana W Apparatus and method for creating literary macrames
US20070239787A1 (en) * 2006-04-10 2007-10-11 Yahoo! Inc. Video generation based on aggregate user data
US20080010601A1 (en) * 2006-06-22 2008-01-10 Dachs Eric B System and method for web based collaboration using digital media
US20080037879A1 (en) * 2006-07-25 2008-02-14 Paxson Dana W Method and apparatus for electronic literary macrame component referencing
US20080222504A1 (en) * 2007-02-26 2008-09-11 Nokia Corporation Script-based system to perform dynamic updates to rich media content and services
US8621354B2 (en) * 2007-06-29 2013-12-31 Russell Henderson Non sequential automated production by self-interview kit of a video based on user generated multimedia content
US20100322589A1 (en) * 2007-06-29 2010-12-23 Russell Henderson Non sequential automated production by self-interview kit of a video based on user generated multimedia content
US20090024963A1 (en) * 2007-07-19 2009-01-22 Apple Inc. Script-integrated storyboards
US20090094039A1 (en) * 2007-10-04 2009-04-09 Zhura Corporation Collaborative production of rich media content
US8037108B1 (en) * 2009-07-22 2011-10-11 Adobe Systems Incorporated Conversion of relational databases into triplestores
US20130254158A1 (en) * 2011-09-14 2013-09-26 O2 Filmes E Videos Ltda Literary object organization

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Monahan, "The Browser - The Organizing Brain of FCP," http://www.kenstone.net/fcp_homepage/basic_browser.html (posted Nov. 5, 2001) *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8583605B2 (en) * 2010-06-15 2013-11-12 Apple Inc. Media production application
US20110307527A1 (en) * 2010-06-15 2011-12-15 Jeff Roenning Media Production Application
US10346009B2 (en) 2012-08-22 2019-07-09 Mobitv, Inc. Personalized timeline presentation
US8793582B2 (en) * 2012-08-22 2014-07-29 Mobitv, Inc. Personalized timeline presentation
US10831353B2 (en) 2012-08-22 2020-11-10 Mobitv, Inc. Personalized timeline presentation
US9715334B2 (en) 2012-08-22 2017-07-25 Mobitv, Inc. Personalized timeline presentation
US10346008B2 (en) 2012-08-22 2019-07-09 Mobitv, Inc. Personalized timeline presentation
US20140304597A1 (en) * 2013-04-05 2014-10-09 Nbcuniversal Media, Llc Content-object synchronization and authoring of dynamic metadata
US9788084B2 (en) * 2013-04-05 2017-10-10 NBCUniversal, LLC Content-object synchronization and authoring of dynamic metadata
US10602424B2 (en) 2014-03-14 2020-03-24 goTenna Inc. System and method for digital communication between computing devices
US10015720B2 (en) 2014-03-14 2018-07-03 GoTenna, Inc. System and method for digital communication between computing devices
US9756549B2 (en) 2014-03-14 2017-09-05 goTenna Inc. System and method for digital communication between computing devices
CN104199920A (en) * 2014-08-30 2014-12-10 深圳市云来网络科技有限公司 Adaptation method and device for display of web application
WO2019140120A1 (en) * 2018-01-11 2019-07-18 End Cue, Llc Script writing and content generation tools and improved operation of same
US10896294B2 (en) 2018-01-11 2021-01-19 End Cue, Llc Script writing and content generation tools and improved operation of same
US10922489B2 (en) 2018-01-11 2021-02-16 RivetAI, Inc. Script writing and content generation tools and improved operation of same

Also Published As

Publication number Publication date
US20080010601A1 (en) 2008-01-10
US8006189B2 (en) 2011-08-23
WO2007149575A2 (en) 2007-12-27
WO2007149575A3 (en) 2008-04-03

Similar Documents

Publication Publication Date Title
US8006189B2 (en) System and method for web based collaboration using digital media
US10592075B1 (en) System and method for media content collaboration throughout a media production process
US10936168B2 (en) Media presentation generating system and method using recorded splitscenes
US20180330756A1 (en) Method and apparatus for creating and automating new video works
US20070118801A1 (en) Generation and playback of multimedia presentations
US8341525B1 (en) System and methods for collaborative online multimedia production
US10706888B2 (en) Methods and systems for creating, combining, and sharing time-constrained videos
US20130151970A1 (en) System and Methods for Distributed Multimedia Production
US8910051B2 (en) Systems and methods for content aggregation, editing and delivery
US20140310746A1 (en) Digital asset management, authoring, and presentation techniques
US20050071736A1 (en) Comprehensive and intuitive media collection and management tool
US20080010585A1 (en) Binding interactive multichannel digital document system and authoring tool
US20060064644A1 (en) Short-term filmmaking event administered over an electronic communication network
CN103988496A (en) Method and apparatus for creating composite video from multiple sources
US20140282087A1 (en) System and Methods for Facilitating the Development and Management of Creative Assets
Sutherland et al. Producing Videos that Pop
Zorrilla et al. A Novel Production Workflow and Toolset for Opera Co-creation towards Enhanced Societal Inclusion of People
Wang Innovation of Film and Television Screenwriter Education in the Era of Mobile Internet
Guimarães Socially-Aware Multimedia Authoring
Baumgärtel et al. Van Gogh TV's "Piazza Virtuale" - Report-in-Progress and Preliminary Case Study
Van Tassel et al. Managing the Production Process
Thomas Producing a Scheduled TV Magazine Based on Voluntary Student Work: Metro TV
Kent Careers and Jobs in the Media
Rajamanoharan ICT Influence on television newsrooms
Sites: In 2010, the Harold B. Lee Library (HBLL) Multimedia Production Unit, Brigham Young University, uploaded the video "Study Like a Scholar, Scholar" to the popular video-sharing site YouTube. The video parodied a series of Old Spice commercials ("The Man Your Man Could Smell Like") in order to market library services, anything from laptops to snack zones, to their students. The video went viral and has accumulated millions of views in just a few years. The message? If you want better grades, use the library.

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION