OS X Terminal for Data Movers pt. 1

September 2, 2016

by Bennett Cain and Edward Richardson

Like it or not, moving around piles of data is just part of the job for those working with modern digital camera systems and the files they generate. In the digital film & television production trenches, terabytes of the stuff can easily be amassed in a single day. In order to stay employed, Digital Imaging Technicians, Assistant Editors, Facility Ingest Operators and others on the front lines must be able to quickly and efficiently keep up with the production's precious mountains of zeros and ones. There are many tools in this arsenal, but an under-utilized one is the readily available command line in everyone's Mac Terminal app.

If you don’t have much programming or scripting experience, typing Bash commands in the terminal can be a little intimidating. And for good reason: a bad entry can be devastating. Data deleted here doesn’t go into the Trash but is gone forever, like it never existed. Nothing to fret over, but definitely something to be aware of! 

The good news is there are many excellent tutorials out there on this topic. A cursory Google search will often yield a knowledge base for whatever you're trying to accomplish in the terminal. Once you’ve learned the basics, generic commands found online can easily be tweaked for one's own needs. 

For simplicity's sake, this article does not cover Bash on Linux or other Unix systems at all. While there is a lot of crossover, for our purposes it's a bit of a different beast and there's not much point exploring it at this time. 

For starters, if you’re not sure where the Terminal app lives, it's found in your Mac's Applications > Utilities folder. Open it up, find a few files you can safely experiment with, pour yourself a strong cup of coffee and let's get started. We recommend you start here and work to the end in a linear fashion, as each section of the tutorial very much builds on the previous. 

 

1. Anatomy of a Bash Command

The first thing to be aware of with any programming language is syntax, which, like any other grammar, has its own rules. The Terminal likes what it likes and is totally unforgiving. Any errors in syntax will not return successful results, and it's important to note that all entries are Case Sensitive! 

Commands are made up of three distinct parts. Let’s take a look at a very basic command that lists out the contents of my Documents folder:

ls -l ~/Documents

ls is the Utility, sometimes referred to simply as the command. There are many different Utilities and they cast the broadest net in terms of what you’re asking the system to do. You will refine the Utility by using Flags and Arguments. 

-l is the Flag, which modifies the Utility by making it more specific. There are many different Flags; they begin with one or two dashes, usually followed by a letter, and they come immediately after the Utility. When multiple single-letter flags are combined, their order isn't important and only one "-" is necessary (for example, ls -la).

~/Documents is the Argument, which tells the Utility and Flag exactly where in the file system to execute the command. The Argument usually comes after the Utility and Flag, can be simple or complex, and may work on its own or alongside other Arguments. Because of this, errors in syntax within the Argument may be more difficult to detect than in the more straightforward Utility and Flag. The tilde, ~, is shorthand for your home directory. More on this in a moment. 

ls -l ~/Documents asks the system to follow the path to my Documents folder (the argument) and then list out (the utility) the contents in long form, a vertical column showing permissions, ownership, etc. (the flag). 
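To see the shape of that output for yourself, here's a sketch on a scratch folder (the folder and file names are invented for illustration):

```shell
# Create a scratch folder with a couple of files, then list it in long form.
mkdir -p /tmp/ls_demo
touch /tmp/ls_demo/notes.txt /tmp/ls_demo/shotlist.pdf
ls -l /tmp/ls_demo
```

Each line of the long-form output shows permissions, owner, group, size, modification date, and the file name.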

If you see "#" in a command, the hash sign signifies a "comment": text that will not be read as part of the command but may be a useful note for someone reading it and trying to make sense of it. 

 

2. Getting Around in the Terminal Window

The most immediately useful (and least destructive) command set concerns basic file system navigation and networking. 

The first thing to be aware of regarding navigation is that you can drag folder and file icons into the terminal window to create the file path instead of typing it out manually, saving yourself a lot of time and energy. 

Every new terminal window opens with a prompt like Bennetts-MacBook-Air:~ bencain$. This is your Home Directory, the default location of new terminal windows. The prompt shows the name of the machine, the current directory (~ for home), the username, and "$", which means the terminal is ready to accept a command from this normal, non-root user. The home directory and where you are in the file system are important concepts, as they will affect how your arguments need to be written out. 

Closely related to this is the concept of the Working Directory. Wherever you are in the file system is your working directory. This location determines how file paths will need to be written out which in turn affects how your command's argument is written. 

For example, when I open a new terminal window on my machine I get this:

Bennetts-MacBook-Air:~ bencain$ 

This tells me that I'm at the home directory of the user bencain's files. By default, it's also my current working directory.

I can type:

pwd

And this will return:

/Users/bencain
Bennetts-MacBook-Air:~ bencain$ 

The command pwd, for Print Working Directory, tells me exactly where I am in the file system at any time. 

I can also type:

whoami

The terminal will return the name of the current user, in this case bencain.

Bennetts-MacBook-Air:~ bencain$ whoami
bencain
Bennetts-MacBook-Air:~ bencain$

Directly affecting the working directory is the command cd, for Change Directory.

cd

In the terminal, type cd, a space, and then drag in the destination folder you'd like to move to. For example:

Bennetts-MacBook-Air:~ bencain$ cd /Users/bencain/Pictures

Hit Enter and this is what returns:

Bennetts-MacBook-Air:Pictures bencain$

When you're not in the home directory, the name of the working directory replaces the "~" in the prompt, in this case "Pictures." Using the pwd command once more, the following would return:

/Users/bencain/Pictures
Bennetts-MacBook-Air:Pictures bencain$ 

This shows me that my current working directory is /Users/bencain/Pictures. The cd command also functions as a shortcut: typed on its own with no argument, it conveniently takes you back to your home directory from anywhere in the file system. 

At this point it's worth noting the difference between Absolute and Relative File Paths.

/Users/bencain/Pictures/samplePhoto.jpg

This is an absolute file path as it's the definitive path beginning at the top level, through the computer's file hierarchy to samplePhoto.jpg.

Users/bencain/Pictures/samplePhoto.jpg

This is a relative file path: it tells you samplePhoto.jpg is three levels down, but not precisely from where. If it began with "/" it would be an absolute file path. 

The tilde "~" signifies the home directory, so wherever you are in the file system you can cd (change directories) to a folder in your home directory. For example: 

Bennetts-MacBook-Air:HDR_images bencain$ cd ~/Documents

hit Enter

Bennetts-MacBook-Air:Documents bencain$ 

Before, I was several levels deep in my Pictures directory, but because Documents is in my home directory, by typing:

cd ~/Documents

I can get into Documents without having to type cd /Users/bencain/Documents
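You can watch the tilde expansion happen; echo prints whatever the shell expanded its argument to (the paths in the comments are illustrative):

```shell
# The shell expands ~ to your home directory before the command runs.
echo ~             # prints something like /Users/bencain
echo ~/Documents   # prints something like /Users/bencain/Documents
```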

Closely related to "~" is the use of "." and ".." for navigating around the file system. 

"." always indicates the current working directory whereas ".." always means one level up from the current working directory. For example:

cd ..

This will take you one level up from where you are. 

cd ../..

Two levels up.

cd ../../..

Three levels up, etc. 
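A quick sketch on scratch directories (paths invented for illustration) shows how each .. step climbs the tree:

```shell
# Build a small tree, dive to the bottom, then climb back up with ..
mkdir -p /tmp/nav_demo/a/b/c
cd /tmp/nav_demo/a/b/c
cd ..        # now in /tmp/nav_demo/a/b
cd ../..     # now in /tmp/nav_demo
pwd
```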

What we're doing here is navigating the file system without the use of a GUI, which is why ls, or listing, is so helpful. 

The Finder can take some time to display the contents of folders filled with thousands of large files. By typing:

ls -l

or simply

ls

and dragging the desired folder into the terminal, you'll quickly get back a nice orderly list of that folder's contents. ls -l will return a list in one vertical column with ownership, permissions, and some other info, whereas plain ls will list out file names only.

There are many other flags you can use with ls to suit your needs. Conveniently, you can call up a User Manual of sorts right in the terminal that will show you relevant command options. 

cat

The cat command, short for concatenate, followed by the path to a file will read out its contents in text form.

Closely related is the man command, which opens the built-in manual page for any utility. For example, to read the manual for cat:

man cat

Very helpful when you're trying to figure out options for flags or combinations of them. When you're done, type "q" to return to the main prompt.

By listing directories out with ls and using cd and keyboard shortcuts, you can jump around in the filesystem almost as fast as in the Finder. 

A few other tips to increase your speed:

Hit tab to autocomplete something you're typing based on what files exist in the system. For example, if you're typing the name of one of your files, "sampleFile.txt," you could type "samp" and hit tab to autocomplete the entry. 

Hitting up and down arrows at the prompt will cycle through your previous entries. If you have redundant commands to enter, there's no need to retype: simply press up until the command you want appears at the prompt, then hit enter. If you’ve changed directories since the last time you issued the command, there could be problems if a relative path was used or there was an assumed path from the working directory. You might perform a command on an unintended folder.

To edit a command, you can navigate the cursor with the left and right arrows. An unmodified arrow will move the cursor one character at a time, while adding the option key will move the cursor to the previous or next word. Control-A will move the cursor to the beginning of the line and Control-E will move it to the end. And by Option-clicking you can move the cursor to the location of your mouse pointer. Notice that anything you type will be inserted into the existing command, i.e., it won’t type over what’s already there. 

Moving the cursor - a few helpful options

Control A     goes to the beginning of line

Control E    goes to the end of line

Option Forward Arrow    goes to the next word

Option Back Arrow    goes to the previous word

Option Click    moves cursor to the mouse arrow    

Pinging: 

Something peripherally related to file system navigation is the simple “ping” command. This is extremely useful for testing connections between devices on your network. In the terminal type: 

ping 127.0.0.1

Hit Enter

This is the Loopback address of your machine, also called localhost. It should always be available so you should get something like this:

64 bytes from 127.0.0.1: icmp_seq=0 ttl=64 time=0.121 ms

64 bytes from 127.0.0.1: icmp_seq=1 ttl=64 time=0.184 ms

64 bytes from 127.0.0.1: icmp_seq=2 ttl=64 time=0.110 ms

64 bytes from 127.0.0.1: icmp_seq=3 ttl=64 time=0.161 ms

64 bytes from 127.0.0.1: icmp_seq=4 ttl=64 time=0.161 ms

Now try:

ping 8.8.8.8

This is the address of one of Google’s DNS servers. There’s a good chance this server is running, so it’s a quick way to see if you have a valid connection to the internet. The output might look like this:

64 bytes from 8.8.8.8: icmp_seq=0 ttl=55 time=19.402 ms

64 bytes from 8.8.8.8: icmp_seq=1 ttl=55 time=19.951 ms

64 bytes from 8.8.8.8: icmp_seq=2 ttl=55 time=16.319 ms

64 bytes from 8.8.8.8: icmp_seq=3 ttl=55 time=27.158 ms

64 bytes from 8.8.8.8: icmp_seq=4 ttl=55 time=11.032 ms

When we see output like this, we know we have a valid network connection. You might notice the times, for example 27.158 ms, are much longer than from localhost because the ping is being answered from out on the internet. 

If what returns is this:

Request timeout for icmp_seq 0
Request timeout for icmp_seq 1
Request timeout for icmp_seq 2
Request timeout for icmp_seq 3
Request timeout for icmp_seq 4
Request timeout for icmp_seq 5

Then there’s no network connection between your machine and whatever IP address you pinged.

With ping and all other terminal commands, hit Control C at any time to pull the plug on what's running.  Pinging is by far the easiest and handiest command for network troubleshooting and should be one of the first steps. 
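One flag worth knowing (see man ping for the rest): -c sends a fixed number of packets and then stops on its own, so no Control C is needed:

```shell
# Ping the loopback address five times, then stop automatically.
ping -c 5 127.0.0.1
```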

One last tip for this section, when the window gets too cluttered with text, you can always type clear at the prompt to push it away and give you a fresh place to start over. 

 

3. Moving Data

And now to the meat of the article.

This will actually move a lot faster now that the basics have been established. First, why would you move files around with the terminal when it can be accomplished so easily and graphically in the Finder? For most tasks, the Finder suffices but once you get into moving around thousands of files, or hundreds of thousands of files, its limitations become readily apparent. 

The problem is when the Finder copies files, it first catalogs them and prepares their graphical representation. This can slow down the CPU, causing the OS to limp along on basic tasks such as populating a folder with its content. Finder also treats a copy as one operation even if it's actually thousands of files. If the Finder barfs on it, not only will the operation freeze with nothing more copied, but every file copied before the error could be compromised. 

For my own purposes, I find the terminal most useful for filling in the gaps of the Finder's limitations — tasks like getting folder and file sizes for massive items, copying/moving/deleting file sets of more than 5000, listing out the contents of directories that Finder takes too long to populate among many other things.

The first handy command is just to determine how much data there is on a volume or in a directory. 

du

or

du -sh followed by the path to the volume (remember you can just drag the folder icon into the terminal).

du is "disk usage" and will show the size of a file or folder in blocksize. It can be combined with a great number of different flags such as -a and -k, but when used with -sh, the command returns the size of the directory in one nice tidy line in a "human readable" format, in this case Kilobytes, Megabytes, Gigabytes, or Terabytes. Run du on Source and Destination folders after copying for a quick, down and dirty file size comparison (not a true checksum, but a useful sanity check).
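A sketch on an invented scratch folder (dd just manufactures a 512 KB file so du has something to measure):

```shell
# Make a scratch folder with a 512 KB file, then ask du for a human-readable total.
mkdir -p /tmp/du_demo
dd if=/dev/zero of=/tmp/du_demo/clip.mov bs=1024 count=512 2>/dev/null
du -sh /tmp/du_demo
```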

The command for copy is very simple:

cp

When copying files using the Terminal, this is the order of the Source and Destination within the argument. 

command (flag) /file/path/to/Source /file/path/to/Destination

For the first and most basic example, let's say I want to copy a file called Good_Photo.jpg from Pictures to a folder on my Desktop called Best_Work but first I need to make that folder with the mkdir command.

I'm going to change directory to my Desktop and then make the new directory. Looks like this:

Bennetts-MacBook-Air:~ bencain$ cd ~/Desktop

I'll now make a new directory with mkdir


Bennetts-MacBook-Air:Desktop bencain$ mkdir Best_Work
 

Now I'm going to run my copy cp command:

Bennetts-MacBook-Air:Desktop bencain$ cp /Users/bencain/Pictures/Good_Photo.jpg /Users/bencain/Desktop/Best_Work

Or, if I had navigated into Pictures with cd, it would look like this, because I don't need to type the full path to a file in my current working directory:

Bennetts-MacBook-Air:Pictures bencain$ cp Good_Photo.jpg /Users/bencain/Desktop/Best_Work

After running this command, the file Good_Photo.jpg now exists in two directories — Pictures and Best_Work.
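The whole walkthrough can be replayed on scratch paths (all names here are invented stand-ins):

```shell
# Recreate the mkdir + cp walkthrough with throwaway files.
mkdir -p /tmp/cp_demo/Pictures /tmp/cp_demo/Best_Work
touch /tmp/cp_demo/Pictures/Good_Photo.jpg
cp /tmp/cp_demo/Pictures/Good_Photo.jpg /tmp/cp_demo/Best_Work
ls /tmp/cp_demo/Best_Work   # the copy now exists in both folders
```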

In another scenario, I have a file called Bad_Photo.jpg that I want to move to a folder on my Desktop called Delete_Later.

The command for move works just like dragging a file from one Finder window to another. It is:

mv

To move my file, this is how I would write out the command:

Bennetts-MacBook-Air:~ bencain$ mv /Users/bencain/Pictures/Bad_Photo.jpg /Users/bencain/Desktop/Delete_Later

After running it, the file Bad_Photo.jpg now only exists in the directory Delete_Later.

Here are a few flags to make the mv command much safer:

-i     prompt for user interaction if a file is going to be overwritten

-n    don’t overwrite files

-v    verbose, show each file as it’s copied or moved
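Combining those flags, a cautious move might look like this sketch (scratch paths, invented names):

```shell
# -n refuses to clobber an existing file at the destination; -v narrates the move.
mkdir -p /tmp/mv_demo/Pictures /tmp/mv_demo/Delete_Later
touch /tmp/mv_demo/Pictures/Bad_Photo.jpg
mv -nv /tmp/mv_demo/Pictures/Bad_Photo.jpg /tmp/mv_demo/Delete_Later/
```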

While you can easily just drag all this stuff into the trash, there is a command to delete it.

rm

Short for remove, this command deletes files and should be used with the utmost care especially with flags such as recursive, signified with: 

-r

This flag means apply the command to everything in the directory including subdirectories and their contents. There's a little gotcha here to be aware of.

sudo rm -r or rm -rf

sudo stands for "superuser do" and runs a command with root-level privileges. We'll get more into this in the next tutorial, but sudo commands can be incredibly powerful and inadvertently destructive. For example, a stray sudo rm -r pointed at the wrong path can wipe your entire hard drive. Just to reiterate, data deleted in the terminal doesn't go to the Trash; it is gone forever. Make sure your argument is the absolute path you intend before deleting any files with a terminal command! The force flag (-f) can also be combined as rm -rf (with or without sudo) to delete whatever files are in the argument without any confirmation prompts. 
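To build the muscle memory safely, practice on a scratch directory you created yourself (all paths here are throwaways):

```shell
# Make a throwaway directory, then remove it and everything inside it.
# There is no Trash and no undo: triple-check the path before hitting Enter.
mkdir -p /tmp/rm_demo/old_renders
touch /tmp/rm_demo/old_renders/take1.mov /tmp/rm_demo/old_renders/take2.mov
rm -r /tmp/rm_demo/old_renders
```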

The key to copying, moving or deleting lots of data is this -r flag, which is once again short for recursive, meaning everything in the directory. By using this along with bracket sets {..} and chaining commands with &&, mountains of data can be efficiently dealt with in the terminal. 

To assist in accomplishing these tasks, there's an excellent text editor called Sublime Text. "Find All" and "Replace All" can ease the pain of a complex command-line-based workflow by isolating and modifying the same object across multiple commands. I also highly recommend writing your commands in something like Sublime because of the Quotation Mark Problem. In the terminal we want to use straight quotes ( " " ), not curly quotes ( “ ” ). By default, TextEdit enables smart quotes that will automatically turn straight quotes into curly ones, and it’s common to see curly quotes in online examples for the terminal. Be on the lookout! They won’t work when you try to run the command. Sublime only uses straight quotes. Sublime is your friend.

In practice, I copy large sets in batches, say 5000 files at a time, from my working directory to a folder on a network drive. The shell limits how long a single command's argument list can be (look up ARG_MAX), so enormous file sets can't always be handled in one go. This is easily circumvented by chaining multiple commands using &&, which specifies that if the first command returns successful, the next will proceed.

Here's another similar case. I want to copy frames 20,000 through 40,000 of an OpenEXR sequence (sample_project_20000.exr to sample_project_40000.exr) from my system drive to a directory called Transfer on an external drive called Backup. At 33MB per file, the Finder would struggle with this copy, so setting it up in the terminal is a better option. 

First cd to the directory where the files live.

Step 1, change to the directory you'd like to copy files from: 
Mac-Workstation:~ bencain$ cd ~/Media/OpenEXR/Sample_Project

Step 2, set up the first command which utilizes cp -r to copy all the files within the range specified by the brackets (note there are two dots between the numbers, this is the required syntax):
Mac-Workstation:Sample_Project bencain$ cp -r sample_project_{20000..24999}.exr /Volumes/Backup/Transfer/ &&

Step 3, && allows you to chain the rest of the commands together:
cp -r sample_project_{25000..29999}.exr /Volumes/Backup/Transfer/ &&
cp -r sample_project_{30000..34999}.exr /Volumes/Backup/Transfer/ &&
cp -r sample_project_{35000..40000}.exr /Volumes/Backup/Transfer/ 

That's all there is to it. While the transfer is in progress, typing in the terminal window will be unavailable. You'll know the operation is complete when the prompt returns and you can begin typing again. One thing to be mindful of with bracket sets is that the syntax is very specific and easy to screw up. For example, there must be exactly two dots between the numbers in the range, no more, no less. {25000..29999} will work but {25000...29999} will not.
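Here's the same pattern at toy scale (invented file names) so you can watch it work without risking real media:

```shell
# Make ten numbered "frames", then copy them in two chained batches.
mkdir -p /tmp/brace_demo/src /tmp/brace_demo/dst
touch /tmp/brace_demo/src/frame_{101..110}.exr
cp /tmp/brace_demo/src/frame_{101..105}.exr /tmp/brace_demo/dst/ &&
cp /tmp/brace_demo/src/frame_{106..110}.exr /tmp/brace_demo/dst/
ls /tmp/brace_demo/dst | wc -l
```

Note that the batches don't overlap: the first ends at 105 and the second picks up at 106.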

When you're done you can run du on both Source and Destination directories and compare the sizes as a quick sanity check. 

RSync:

One step beyond all this is the command rsync, short for remote sync, which is so useful, and has so many options, that it's worthy of its own post. 

rsync -r /path/to/source/ /path/to/destination

For example: 

Bennetts-MacBook-Air:~ bencain$ rsync -r /Users/bencain/Desktop/ProjectFiles/ /Users/bencain/Desktop/BackUp

This command will copy the contents of the source directory, ProjectFiles, into the destination directory, BackUp. While similar to cp, the main difference is rsync's versatility, with many more options for flag combinations. There's even Time Machine-like functionality for rsync with -a, for archive, that can be written into a reusable script. This sort of functionality can be very useful to the Data Mover looking to synchronize the contents of multiple drives that will be constantly amended. More on all this next time!

 

4. Summary

From the material covered in this first article, these are the key items:


pwd     Returns Current Working Directory
whoami     Returns Current User
cd     Change Directory
~/     Home Directory Shorthand
ls     List
ls -l     List in Long Format (Vertical Column)
mkdir     Make Directory
du -sh     Disk Usage Human Readable
tab     Auto Complete
ping     Test Network Connection
clear     Clear the Terminal Window
cp     Copy
cp -r     Copy Recursively
{..}      Bracket to Apply a Command to a Range of Files
mv     Move
sudo rm -r   Delete all the Contents of a Directory as Root (Password Prompt). Use with Extreme Care.
rm -rf     A Delete Command that does not use sudo. Recursively and Forcibly Removes all Contents of a Directory. No Undo.
-v      Verbose, view the progress of the command
cat     Concatenate (Read Out the Contents of a File)
man     Open the Manual for a Command, e.g. man cat ("q" to leave it)
Control C     Cancel Process
&&     Add another Command to the Queue
Up and Down arrow keys     Scroll through Previous Entries
Left and Right arrow keys     Scroll Left and Right through the Command
drag icon into terminal     Way Faster than Typing
copy and paste into terminal     Way Faster than Typing
#     Comments
Get Sublime!

This is just a drop in the bucket. Like any other language, the more you practice, the greater your degree of fluency. Commands can be written to accomplish virtually any task within the filesystem. 

 

5. Conclusion 

That's it for this one. Next time we'll dig deeper into sudo commands, filtering with grep, using Pipes "|", more on rsync, wildcards "*" and some other advanced commands. Filtering and automation are where the real power of these commands lies, allowing you to get very specific and potentially save a lot of time. For now the best way to learn is just to play around, trying different flags and arguments, just not on any critical data! I recommend setting up a few test directories on the Desktop for learning. 

It's tempting to try to reinvent the wheel in the terminal just because you can. Basic utilities like Automator can do a lot of the stuff you may want to accomplish, such as Batch Renaming, but in a far easier and more intuitive way. Bash and Automator used together can be a particularly powerful combination. Additionally, while you could write a custom command for running md5 hashes and other checksum operations, there are many inexpensive software tools available that do it faster and easier than whatever you may come up with. It's a fun challenge to try to figure some of this stuff out, but at the end of the day, a much easier way might already exist.

Many thanks to Edward Richardson for helping make this article happen. 

Please touch base with any feedback on this series. 

 

NAB 2012 - Round Up


I suspected that this year's show wouldn't be as overloaded with new product as last year because most of the major players have already rolled out the flagships that will carry them for the next few years. I found this to be somewhat the case but not entirely as there were definitely a handful of "show stoppers" on display. The emphasis this year seemed to be much less on new, groundbreaking wares and more "this is what we're working with now, and here's how we can do it better." 

While NAB is in many ways a portal of things to come, this year there were far fewer 3D announcements and a lot more emphasis on 4K which is evident in the current market as well. 3D has struggled to gain much, if any, traction outside of theatrical content and the resolution war is heating up now that all the major manufacturers are, or will be (IBC is next), intro-ing cameras offering greater than HD resolution. This is interesting because it's really not much different than the megapixel war with consumer digital cameras. Resolution while incredibly important is still relative to optics, image processing, presentation, and many other factors. Bigger isn't necessarily better though it's obviously a huge marketing opportunity for these vendors.

Maybe my interests have shifted somewhat as my market, broadcast bound projects, has decidedly settled on the Alexa for now. Because of this my energies are very tied up with solutions for that platform. That said, I didn't spend 3 days at the show exclusively checking out new cameras and hardware but spent much of that time researching workflow and archival solutions and demoing tons of new options for creating on-set deliverables, a topic I've covered at length on this site.

What was really excellent though wasn't all the new gack but the opportunity to meet in person so many people I've been in correspondence with. That's what's great about NAB - getting all these professionals from various facets of the industry together in the same location and the exchange of ideas and information that results. It's inspiring and I left Vegas feeling optimistic about the business and where it's going. 

Quick note: what I had on hand to shoot stills with this year was the trusty Leica M9 w/ Summilux-M 35mm Lens. My favorite camera in the world but definitely not the best choice for shooting product closeups, so I'll apologize for the uninspired photographic component of this post. 


ALEXA. I was pleased to see Arri announce some very nice new features. Nothing earth shattering, but quietly useful. 


4:3 Sensor no longer exclusive to Alexa Studio:

You can now purchase an Alexa Plus with a 4:3 sensor in it. You cannot upgrade your existing camera to the new sensor, which is certain to sour a few owners. However, this is nice because the Alexa Studio is a hefty rental and as neat as it is, I'm not entirely sold on the optical viewfinder. I'm not a camera operator though, and a handful of my colleagues are really into it. I can definitely see the appeal. It will be great to have a more cost effective rental option for anamorphic capture or simply recording a big old square raster with spherical lenses for VFX work. The flexibility of the Alexa system continues to evolve along with the market. Ryan Koo wrote a good article on the topic.

2K ProRes Recording:


Existing 16:9 Alexa sensors will soon be able to record to SxS cards in ProRes 4444 or DNx RGB at 2K resolution, 2048x1152. The new 4:3 sensor will be 2048x1536. Not a ton of extra resolution but appealing nonetheless. Also on the horizon - new debayer algorithm for improved sharpness and real time ArriRaw playback out of ArriRaw Converter.


Also at Arri's booth, Pomfort was there demoing their solution for Alexa color management, LiveGrade.

Pomfort's Patrick Renner


I've written about this software at length and have been a beta user since day one. It's really come a long way and now that CDL and Pre or Post Linearization Color Correction has been implemented, LiveGrade is a legit on-set color management solution for any number of cameras. 

I think there is such a plethora of great NAB coverage I'm not going to spend the time creating a massive post covering all the big beats like I did last year. Here's a few things that stuck with me though - 

BLACKMAGIC DESIGNS:

I'm pretty excited about Resolve 9 but interestingly enough, this is the talk of NAB 2012 - the Blackmagic Cinema Camera aka "My First 2K", a $3000 camera that comes with $1700 of freebies (and I mean that in the MOST non-condescending way. I'm actually quite interested in this camera.. but c'mon look at it.. ViewMaster!)


One thing that no one is talking about with this camera is that the sensor is quite small by today's standards, only a bit more generous than Super 16. The mount is EF and these still lenses are going to be quite telephoto on this small sensor, 3x more telephoto in fact, so that super wide angle Canon 8mm is going to be about a 24mm in Full Frame terms. The other thing is the practical resolution of a Bayer pattern chip at 2432x1366 after demosaicing is a bit less than 1920x1080, with chroma subsampling around 4:2:0. That's just the nature of debayering, but it does offer very robust recording formats, 12 bit Raw and Log encoded ProRes 4444 and DNx RGB. 

Resolve 9


The users spoke and BMD clearly listened. Resolve 9 is now a full fledged dailies solution with the inclusion of audio pass-through and syncing, burn ins, a super clean interface, media management, and an intuitive new toolset. No word on whether the dailies component of 9 will be available in Lite or whether Lite will even continue to exist. I'm guessing you're going to have to shell out $1000 to have access to the new features, which is fair enough. Or you can just buy their camera and get it for free ;)

On the topic of dailies and on-set deliverables -

Everyone is getting into this game now. Assimilate was showing Scratch "The Next Thing" (working title) which is looking more powerful than ever and in my opinion Lab still offers the best cost to value ratio and user support. YoYotta was demoing realtime F65 rendering with Yo Dailies, ColorFront introed a low cost version of On-Set Dailies called Express Dailies, Filmlight has their low cost version Baselight Transfer, Adobe SpeedGrade CS6, etc. Not to mention a handful of software startups with their own offerings. Price tags on these wares run the gamut of course and each one offers its unique take on the complex problem of creating a dailies pipeline. Now that there are so many options, in my opinion the true separating factor will be support. The importance of having an actual human being to communicate with for troubleshooting, software customization, and feature requests can't be overstated. Among this crop, some definitely understand this whereas others, maybe not so much. 

On a software related note - Autodesk Smoke all-in-one editing and effects package now for Mac, down from $15,000 to $3,500. Yet another once nearly unattainable pro tool looking to go mass market through aggressive pricing. 

SONY:

4k projection of a variety of material from the F65; all manner of conditions and mixed lighting. It was very good to see what this camera is actually capable of and it turns out, the potential is enormous. 


4k can only be fully appreciated in a proper 4k projection. It's difficult to gauge the extra resolution on a HDTV or even one of the smaller 4k LCD displays that were floating around the show. Suffice to say, the image quality is remarkable. 

4k Stitch View:


This is a very interesting application of 4k technology. Two F65's side by side, both rasters seamlessly stitched into one 8k picture that you can pan and tilt around in realtime, with no resolution loss until you get down to 1080. It's applications like this, an unintended useful outcome of the technology, that really excite me about all this stuff. I think this technological renaissance we're experiencing in motion pictures can and should extend far beyond the realm of film/tv.  

Sony NEX-FS700:


Everything about this has me scratching my head: the form factor, the generous specs (250 fps at 1080p, by the way), the TBD 4k Raw recording, and the price ("less than $10,000"). It's an odd one, but it's a potentially very cool imaging machine nonetheless.

CANON:

Canon EOS-1D C


I'm way more excited about this than I thought I would be; the specs are out of control and the images coming out of the camera are really impressive. This is a true digital stills and motion picture camera. It's got the form factor of an SLR but all the video features you could ask for: clean output, multiple resolutions and sensor windows, multiple compression schemes, etc. The 4k video isn't raw but is compressed to 4:2:2 at 500 Mbps and written out to CompactFlash. I didn't see any interface on the camera other than HDMI, so I'm assuming a 4k raw recording via transport stream isn't possible. Regardless, I think this is THE camera for someone looking to do both high-quality stills and video with one machine without spending a fortune on peripheral equipment.
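That 500 Mbps figure has practical card-budgeting consequences, so here's the quick math. The 64 GB card size is just an example I picked; real-world capacities and filesystem overhead will shave a bit off.

```python
# Rough record time for the 1D C's 4K mode at the quoted 500 Mbps,
# onto a hypothetical 64 GB CompactFlash card (example size only).

BITRATE_MBPS = 500   # quoted 4K 4:2:2 data rate
CARD_GB = 64         # example card capacity (assumption)

card_megabits = CARD_GB * 8 * 1000          # GB -> megabits (decimal)
minutes = card_megabits / BITRATE_MBPS / 60  # minutes of record time

print(f"~{minutes:.0f} minutes per {CARD_GB} GB card")
```

So figure somewhere around a quarter hour of 4K per 64 GB card, which means a serious card inventory for any real shoot day.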

This cracked me up so I took a picture -


ISO 204,800! On the monitor the video was looking super clean at ISO 1600, but these days that isn't as special as it used to be.

Canon C500:


I think where the C300 was lackluster, this camera brings it. No one seems to know how these 4k streams will be recorded quite yet, but Convergent Design is ready to accommodate whatever comes with the Gemini Raw. All these cameras... it's a little overwhelming. With all of these new acquisition options, a universal workflow is going to have to emerge, or anything that comes out trying to reinvent the wheel is going to be sunk before the ship even sails. Once again, we've just been spoiled by the ease of the Alexa. For large-scale productions requiring a fast turnaround, vendors offering up something new need to make it as painless as possible or it's going to be a tough sell.

PANASONIC:

Behind glass and very difficult to photograph was this... the 4k "Varicam" Concept Camera.


It's modular and comes in pieces like the EPIC, and it's about the same size. Not much info to be gleaned other than the AVC Ultra codec recorded to P2 Micro cards (high-capacity SD cards encased in stainless steel or some other tough alloy), and that the 4k recording is not Raw but rather linear RGB. I'll reserve judgment, but my instincts are leaning towards "Too little. Too late."

SONNET AND THE TOPIC OF "MAC EXPANSION":

The consensus at the show regarding Apple's commitment to the pro market was grim, to say the least. Even the future of the 17" MacBook Pro has been called into question... I'm seriously about to start stockpiling computers. But you never know with Apple, and that's the thing. They could announce something tomorrow and this discussion would be over. One has to maintain a cautious optimism with Apple products, which is why I haven't started my stockpile just yet. I try to get as much mileage as I can on-set with 17" MBPs. I'll bring a tower out if I have to, but my M.O. is usually to keep a small footprint and do a lot with a little. That said, I'm very excited about some of the stuff Sonnet is working on, like the RackMac 1U shelf for Mac minis!


And this kind of blew me away... the xMac mini Server.


xMac™ mini Server 1U Rackmount PCIe 2.0 Expansion System With Thunderbolt™ Ports

Sonnet’s xMac™ mini Server (previously known as RackMac mini Xserver) 1U rackmount PCIe 2.0 expansion system with two Thunderbolt™ ports mounts a Mac® mini inside a specially designed enclosure that also contains two x16 (x4 mode) PCIe 2.0 slots, a 150W power supply, and an installed Gigabit Ethernet card. This system enables users to plug two PCIe 2.0 adapter cards (one half-length and one full-length) into slots connected to the Mac mini via locking Thunderbolt cables, while allowing the connection of additional Thunderbolt peripherals to the daisy-chain Thunderbolt port.

A powerful and expandable computer that fits in your rack? This might be it. If Apple jettisons the Mac tower, maybe they'll come out with a souped-up Mac mini. Drop it in something like this and you're ripping. At least in theory, ha.

Echo Express Pro Expansion Chassis for PCIe

These are a nice size.


The Magma Thunderbolt ExpressBox 3T is a similar solution, but it's a monster. The thing holds three PCIe cards, so it's nearly the size of a tower. You can get to a certain point where you're trying to make a laptop into something that it's just not. Is it worth it, and is it really even feasible? At any rate, modularity is now the name of the game and I like to see lots and lots of viable options.


AND ONE LAST THING:

I'm really running out of time for this post, but this is a very cool, under-the-radar item that I wanted to write about:

AXIS1 Single Channel Wireless Lens Control System:


This impressively machined motor can be used for focus, iris, or zoom, and the control is very nuanced. The range is similar to what you would get with a Preston. These are produced by a gentleman in the UK named Peter Hoare, and the kit goes for about $5,000. I've been looking for a solution for wireless iris control, and a couple of these might be it.

That's all I've got for now. I'll try and revisit this post at a later date. 

LiveGrade to Scratch Lab

Pomfort has added a native Scratch .3dl LUT output to LiveGrade. This LUT lines up in Scratch for color-corrected file outputs very nicely. This is an outstanding workflow for generating dailies on commercial projects. Anything you would ever need to do for file generation you can do in Scratch, and LiveGrade is a very fast and seamless way to generate LUTs for it. Using the Avid Artist Control surface with this software combo has sped me up considerably and closed a lot of the previous workflow gaps.
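For the curious, the .3dl files flowing between these apps are just plain ASCII 3D LUTs. Here's a minimal sketch that writes an identity LUT in that format. The specifics are my assumptions (a 17-point mesh, a 10-bit input axis, 12-bit output values, and red varying slowest / blue fastest); check Scratch's and LiveGrade's own documentation for the exact variant they exchange.

```python
# Minimal sketch of an ASCII .3dl 3D LUT: one header line listing the
# input axis values, then MESH**3 lines of "R G B" output triplets.
# Assumed parameters (verify against your software's docs):
#   17-point mesh, 10-bit input axis (0-1023), 12-bit output (0-4095),
#   red channel varying slowest and blue fastest.

MESH = 17
IN_MAX, OUT_MAX = 1023, 4095

def identity_3dl():
    # Header: the sample points along each input axis.
    lines = [" ".join(str(round(i * IN_MAX / (MESH - 1)))
                      for i in range(MESH))]
    # Body: one output triplet per lattice point (identity mapping).
    for r in range(MESH):
        for g in range(MESH):
            for b in range(MESH):
                out = [round(c * OUT_MAX / (MESH - 1)) for c in (r, g, b)]
                lines.append(" ".join(map(str, out)))
    return "\n".join(lines)

lut_text = identity_3dl()
```

An identity LUT like this is also a handy sanity check: drop it into the pipeline and confirm the graded output matches the ungraded source before trusting the real grades.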