Free Clip has been updated to 0.9.2. The changes are as follows:
- The plugin is now available as a Mac VST.
- The Gain knob now ranges from -20 to +20 dB, allowing you to reduce input gain.
- Double-clicking or alt-clicking the Ceiling or Gain controls resets them.
- Various code optimizations.
- The plugin is now compiled with the C++ runtime library statically linked by default.
- New About window.
Free Clip is an intuitive multi-algorithm soft clipper/waveshaper plugin, available as a Windows VST or Mac Audio Unit. The plugin lets you choose between a range of wave shapes, or ‘sigmoid functions’, from the most transparent but harshest hard clip to the ‘softest’ but most saturated arctangent shape. You can then set the ceiling level intuitively using the slider, which perfectly matches the level meter next to it. Oversampling is also available to remove high-frequency aliasing. However, oversampling can introduce peaks slightly over the ceiling level, so if you intend to use this plugin as the final plugin on the master chain, make sure the ceiling level is set appropriately and/or the post-oversampling clip setting is enabled.
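As a rough illustration of the difference between these shapes (this is not Free Clip’s actual code — the function names and the arctangent normalization below are assumptions), a hard clip flat-tops the signal at the ceiling, while an arctangent shape bends it gradually towards the ceiling:

```python
import math

def hard_clip(sample, ceiling=1.0):
    # Most transparent: samples below the ceiling pass through untouched,
    # anything above is flat-topped (harsh, rich in high harmonics).
    return max(-ceiling, min(ceiling, sample))

def arctan_clip(sample, ceiling=1.0):
    # 'Softest' shape: unity gain for small signals, but saturation starts
    # early and the output only approaches the ceiling asymptotically.
    return ceiling * (2 / math.pi) * math.atan((math.pi / 2) * sample / ceiling)

for x in (0.1, 0.9, 2.0):
    print(f"{x:4} -> hard {hard_clip(x):.3f}  arctan {arctan_clip(x):.3f}")
```

Note how, with this normalization, the arctangent shape is already saturating well below the ceiling, which is exactly why it sounds softer and more coloured than the hard clip.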
This plugin is a great way to transparently and conveniently boost volume without clipping your DAW, whether on an individual stem or sound effect, or on an entire track you made. For mastering, the hard clip, quantic or cubic shape is recommended, as these introduce little or no saturation.
The plugin can also be used as a more traditional saturation/distortion plugin by choosing a ‘softer’ wave shape, such as algebraic or arctangent. Simply lower the ceiling level to apply more saturation to the signal, then remember to boost the output afterwards. Alternatively, you can boost the input gain into the clipper for the same effect. Increasing the oversampling value is helpful if you hear high-frequency aliasing in this case! Be warned that high oversampling values such as 16 or 32 times can be very heavy on the CPU.
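For a hard clip — or any shape that simply scales with its ceiling — the two workflows above are mathematically equivalent: boosting the input gain into a fixed ceiling gives the same result as lowering the ceiling and boosting the output afterwards. A small sketch (hypothetical function names, not the plugin’s code):

```python
def hard_clip(sample, ceiling):
    return max(-ceiling, min(ceiling, sample))

signal = [0.2, 0.9, -1.4, 0.6]
drive = 2.0  # input gain boost, as a linear factor

# Workflow A: boost the input gain into the clipper at a fixed ceiling.
a = [hard_clip(drive * s, 1.0) for s in signal]

# Workflow B: lower the ceiling, then boost the output to compensate.
b = [drive * hard_clip(s, 1.0 / drive) for s in signal]

# Both routes saturate the same samples by the same amount.
assert all(abs(x - y) < 1e-12 for x, y in zip(a, b))
print(a)
```

The practical difference is only about gain staging — which knob you prefer to reach for, and where in the chain the makeup gain sits.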
Free Clip is completely free of charge and will be available to download soon at www.vennaudio.com and selected distributors.
Produced with Gaucho Productions, this project is a VR experience in a Ferrari, accompanied by Alastair Weaver talking you through the supercar.
Gaucho is a great company to work with, bringing real enthusiasm and ideas about this new form of video and the role of traditional storytelling. To produce this piece, Adam went with Gaucho’s team and miked up the Ferrari inside and out, putting into practice everything we have learnt from previous 360 projects. As such, what you hear in the finished video is all sync sound, recorded from many perspectives to build up a true 360 image of being in this incredible car.
Again, many thanks to the incredible team at Two Big Ears/Facebook 360 who continue to develop their plugins along with the creative community of 360 sound designers.
Produced with Adrift Pictures, Seance 360, our latest VR work, is a 12-minute scripted horror film with spatial audio. Most VR video so far has been short and experimental, so this project was very ambitious – especially from a sound perspective. For more info and behind-the-scenes photos, look here.
Harry from Adrift was incredibly encouraging and gave us great creative freedom. Adam did the sound design, recording sound effects, ransacking the Venn library and filling out the atmosphere with each ghostly visit. With a gramophone in the scene, Jonny composed the music to go on its wax disc. Again, we used Facebook 360 Spatial Workstation to position the sounds, and there are new updates all the time on the Facebook group, along with some very interesting discussions.
So, on our YouTube channel, we can offer you a quick 30 seconds. If you want to see the rest, get in touch with Adrift. As we wrote in the previous post, distribution is a big challenge in a media ecosystem as new as VR video, so it’s just on YouTube for now.
The video was an incredible learning curve in which we got up to speed on 360 video and audio production. First, we researched different methods for producing the video – in this case using a ball of GoPro cameras whose images were stitched together in post-production. We researched 360 audio recording techniques, and rented a TetraMic – an ambisonic microphone by Core Sound – from Mutiny Media. You can hear its recordings in this video, in the moments where the camera is stationary and the Audi moves around it. We researched the post-production, testing a few different sets of plugins for mixing in 360, and used the Facebook 360 Spatial Workstation to mix the audio. These guys are incredible developers, taking questions and feedback from the community of users via a Facebook group. And seriously, if they can’t solve your problem with a single comment, their answer is usually “we’re already working on it”, and the problem is solved in the next update. Their latest update handles exporting not only the audio but the video as well, formatted and tagged with metadata ready for different platforms.
Adam was the recordist for the shoot with Gaucho, working with a great team, placing radio mics for specific FX and dialogue, hiding a Zoom H2n in the car, and using the Core TetraMic and a Zoom H6 for the exterior atmospheres. Extra FX, such as the wind in your face, were recorded into the Zoom H2n using its spatial firmware update, which had not yet been available at the time of the shoot.
When it came to post-production, the ambisonic recordings were decoded within the DAW, the Zoom’s quad recordings and the mono recordings were positioned in 360 space, and the music was made to change as little as possible as the viewer shifts their focus. However, a few problems in post-production limited the use of 360 effects, as you can hear in the video. In fact, due to time restrictions the video was originally published in stereo. The version below is adapted mainly to include the ambisonic recordings, and still has a few bugs. However, in the interest of advancing people’s experiences with and knowledge of 360, here it is.
Distribution is also a challenge. Currently, VR platforms are spread across lots of different formats, requiring various hardware and file specifications. So, for now, this driving experience is available over YouTube and we intend to have other projects on other platforms soon…
For more online 360 videos, search through Facebook (who are doing great things in advancing 360 sound) and YouTube. Also, keep an eye on our website – we have more 360 experiences to come!
There seem to be some problems viewing on Mozilla Firefox sometimes – if things aren’t working as expected, try another browser.
When producing sound effects for a sound effect library, you tag your sound effects with metadata so that sound designers, using sound library software, can find what they want more easily. Without this metadata, the file names could end up very long…
“What does Venn use to tag metadata in sound effects?”
BWF MetaEdit gives you the most control we have found over all the kinds of metadata in your sound files, including metadata saved automatically by certain software.
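Under the hood, much of this “broadcast wave” metadata lives in a bext chunk inside the WAV file, defined by EBU Tech 3285; BWF MetaEdit is essentially an editor for this and related chunks. As a hedged sketch of the layout (showing only the first two fields – a real chunk carries many more after these):

```python
import struct

# Build a minimal stand-in for a 'bext' chunk body. Per EBU Tech 3285,
# the first field is Description: 256 bytes of NUL-padded ASCII.
description = b"door slam, wood, interior, close".ljust(256, b"\x00")
originator = b"Venn Audio".ljust(32, b"\x00")  # next field: Originator, 32 bytes
chunk_body = description + originator           # real chunks continue beyond this

def read_bext_description(body):
    # Description is the first 256 bytes; strip the NUL padding.
    (raw,) = struct.unpack("256s", body[:256])
    return raw.rstrip(b"\x00").decode("ascii")

print(read_bext_description(chunk_body))
```

This is only the fixed-size bext fields – sound library software also reads free-form metadata from INFO and iXML chunks, which is where much of the richer tagging ends up.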
“What about on-set recordings?”
For recordings to be synced with video, Sound Devices’ Wave Agent is excellent. Wave Agent uses tags in the ‘description’ field to store metadata about scene, take, frame rate, channel name, etc. This can then be read by the DAW.
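Those description-field tags are plain KEY=value pairs packed into the description text, one per line. The exact tag names, such as sSCENE and sTAKE, vary between recorders, so treat the ones below as illustrative. A DAW – or a small script – can split them back out:

```python
def parse_description_tags(description):
    # Split a description string into KEY=value pairs, one per line;
    # lines without '=' are ignored.
    tags = {}
    for line in description.replace("\r\n", "\n").split("\n"):
        key, sep, value = line.partition("=")
        if sep:
            tags[key.strip()] = value.strip()
    return tags

desc = "sSCENE=12A\r\nsTAKE=3\r\nsTRK1=Boom"
print(parse_description_tags(desc))
```

Because everything is crammed into one free-text field, any tool that rewrites the description wholesale can clobber these tags – which is part of why Wave Agent and chunk-level editors don’t always coexist happily.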
Unfortunately, you cannot use Wave Agent and BWF MetaEdit together very well. While Wave Agent will show this:
BWF MetaEdit will show this:
Any suggestions of other software to use? Leave a comment!
Part four of our series on dialogue editing is here! Watch the video to check out my editing style for dialogue-driven scenes. This video uses a scene produced by Arts Educational Schools as showreel material for Henry Gibbs, Arman Mantella and Nicole Sawyerr. If you’re in the business of hiring actors, check out their Spotlight links in the description.
If you’ve got some ideas of alternate ways to do things, put them in the comments!