
Month: September 2015

Digital miracles

When shooting with a normal (amateur) camera, a DSLR, or another low-cost medium, the footage is captured at a decent quality, designed to be viewed and edited as is; then, to optimize quality against the space occupied in memory, the color is subsampled, so that less color information has to be recorded.

[Image: chroma subsampling ratios]

So a classic video has its color recorded with 4:2:0 sampling. This means the chroma channels, which carry the red- and blue-difference information, are stored with only a quarter of the samples; once the file is decoded into RGB, the red channel in particular is rebuilt from much less information than the others. In a normal situation this causes no particular problems and most people will never notice, but…
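To put numbers on it, here is the arithmetic for a single Full HD frame (a quick illustrative Python snippet, nothing more):

# Chroma sample counts for one 1920x1080 frame: 4:4:4 vs 4:2:0.
w, h = 1920, 1080
chroma_444 = 2 * w * h                 # full-resolution Cb + Cr planes
chroma_420 = 2 * (w // 2) * (h // 2)   # both planes halved in each axis
print(chroma_420 / chroma_444)         # 0.25 -> 75% of the color data is gone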

During post-production, saturating the colors brings out the problem, amplifying the blocking, that is, making the codec's compression blocks visible, as you can see in the image below.

Why red is a bad color for filmmakers

There are colors based on the red channel that can cause obvious problems, as you can see in the image, ruining a shot.
Sometimes, and I stress sometimes, you can save these images by converting them in the most appropriate way, using utilities that upsample the red channel so as to reduce those blocking artifacts.
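To give an idea of what such utilities do, here is a minimal Python sketch of smooth chroma upsampling, assuming you already have a frame decoded to planar YCbCr (it uses Pillow and NumPy; the real tools are far more sophisticated):

import numpy as np
from PIL import Image

def upsample_chroma(y, cb, cr):
    # y: (H, W) uint8 luma; cb, cr: (H/2, W/2) uint8 subsampled chroma planes.
    h, w = y.shape
    # Bicubic interpolation spreads each chroma sample smoothly instead of
    # repeating it over a 2x2 block, which is what causes visible blocking.
    cb_up = np.asarray(Image.fromarray(cb).resize((w, h), Image.BICUBIC))
    cr_up = np.asarray(Image.fromarray(cr).resize((w, h), Image.BICUBIC))
    ycbcr = np.dstack([y, cb_up, cr_up]).astype(np.uint8)
    return Image.fromarray(ycbcr, mode="YCbCr").convert("RGB")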

There are different tools that act on these channels to reduce the defects; depending on your toolset you can rely on different solutions:

  • Inside the Red Giant Shooter Suite, the Deartifacter tool
  • The standalone 5D2RGB conversion utility, which converts 4:2:0 files to ProRes 4:2:2
  • The old HD Link program from the Cineform Pro and Premium suites (no longer available, since GoPro discontinued the suite).

I personally recommend the Red Giant suite, because it gives you more control, as well as many tools useful to any filmmaker.
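If you have none of these at hand, a rough equivalent of the 4:2:0 to ProRes 4:2:2 conversion step can be done with ffmpeg; this is just an illustration, not the actual pipeline of any of the tools above, and it assumes ffmpeg is installed (file names are examples):

import subprocess

subprocess.run([
    "ffmpeg", "-i", "clip_420.mp4",
    "-c:v", "prores_ks",         # ProRes encoder
    "-profile:v", "3",           # ProRes 422 HQ
    "-pix_fmt", "yuv422p10le",   # upsample chroma to 4:2:2, 10-bit
    "-sws_flags", "lanczos",     # smoother chroma interpolation
    "clip_422.mov",
], check=True)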

The importance of backup

In a world where so many words are thrown around out of all proportion, the word Cloud in particular, backing up your data has never been more essential.

If something happens to your computer, your smartphone, your camera cards… you would lose everything… your data, your memories, your work…

I know what most people think: “It won’t happen to me anyway, the data is safe on my hard drive, I have a copy in the cloud…” etc…

Well… I’m going to tell you something disturbing… none of these storage systems is secure, no one guarantees the safety of your data, and above all, when you activate one of these services or buy a hard drive or a card, the only guarantee you are given, in the case of some cards, is that if the data is lost they will simply replace the card…

If you are on my site it means we have something in common, for example doing 3D animation, videos, images, photography, and so losing your data would be no small problem…

Many of you have a backup system and feel safe…

Well, do a little research on the loss of the Toy Story 2 project files, and then come back here… you may find that no one is safe, since an extraordinary company like Pixar risked losing Toy Story 2 to a trivial filesystem problem, and they have hundreds of servers and technicians who take turns handling and managing backups… Now…

I have a RAID, I’m safe…

I’ve often heard these words; I used to believe them too. Too bad it was precisely the RAID that betrayed me 10 years ago, when identical disks (because the supertechnicians advise using identical disks for RAIDs, better still with consecutive serial numbers, “they work better that way”, say the ignorant) failed on me at the same time, and so my mirror RAID waved me goodbye… through my tears…

Then I relied on a wider RAID with 4 redundancy disks, guaranteed by the super experts. Too bad this time the damage was caused by a firmware defect in brand-new disks, an entire pallet of hundreds with the same defect, acknowledged by the manufacturer; but by then my data was already dead. The defect meant that within a few minutes the heads began crashing into the platters, and in a short time the damage had spread beyond what the RAID’s redundancy could recover. More tears shed, about 6 years ago…

A solution?

No one has the ultimate solution; I can only tell you what I use to back up my data: three copies of the data, one local on the computer and two on external hard drives, updated on alternate days.

Each block of disks is from different brands and different manufacturers (some brands are produced by the same manufacturers, with the same chipsets and hardware), to avoid shared chipset and firmware defects.

What do I use to keep my backups up to date?

Under Windows and under Mac there are several packages that keep data synchronized, saving you from updating copies by hand, since it is impossible to remember every single file changed from one time to the next.

Under Mac I used an app called Synkron, fine as long as you are on High Sierra; from Mojave onward it gives a lot of trouble, AVOID IT. Under Windows I use the Allway Sync program. For both OSes an interesting solution is FreeFileSync.

Both have automation systems to synchronize multiple folders, whether local, on the network, or in the cloud.

Edit 2020: Synkron seems to be no longer updated by its author and does not work properly under Catalina; I suggest another interesting free product for Windows, macOS and Linux called FreeFileSync.
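For the curious, the heart of what these tools do is a one-way mirror: copy whatever is new or newer from the source to the backup. A minimal Python sketch of that idea (real tools add two-way sync, deletion handling, filters and conflict resolution; the paths are examples):

import os
import shutil

def mirror(src, dst):
    # Walk the source tree and copy files that are missing or out of date.
    for root, _dirs, files in os.walk(src):
        target = os.path.join(dst, os.path.relpath(root, src))
        os.makedirs(target, exist_ok=True)
        for name in files:
            s, d = os.path.join(root, name), os.path.join(target, name)
            if (not os.path.exists(d)
                    or os.path.getmtime(s) > os.path.getmtime(d)
                    or os.path.getsize(s) != os.path.getsize(d)):
                shutil.copy2(s, d)  # copy2 preserves timestamps

mirror("/Volumes/Work", "/Volumes/Backup1")  # example paths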

USB 3.0 … it depends, let’s say more like 2.5…


USB 3.0 is an evolution of the standard that increases data transfer speed roughly tenfold.

Everyone is happy that the time needed to copy and secure our data, photographs and movies is greatly reduced.

The market is full of USB 3.0 storage peripherals, and laptops and desktops now have almost only USB 3.0 ports. A paradise… almost…

The theory says the 3.0 standard takes us from 60 MB/s to 640 MB/s, so we are talking about more than ten times the speed when transferring data between devices.

The theory, but what about the practice?

That is the theory, while reality is quite different: real performance is indeed far superior to USB 2.0, but there are bottlenecks that often go unconsidered:

  • the speed of the computer’s usb 3.0 card chipset
  • the speed of the chipset of the data carrier controller
  • the speed of the media from which the data is copied
  • the speed of the media to which the data is copied
  • if the source and destination media share the same controller, whether the chipset is able to distribute the data stream evenly between the two.

Let’s take a practical example: I buy an external USB 3.0 disk from well-known brand XX (given the speed at which models and devices change, there is no point naming brand and model, since the same model purchased several times contained different disks of different speeds), I try to copy some data, and I find it decidedly slow…
I try changing the computer port, nothing; I try changing the computer, nothing; I try changing the data source, nothing… Being a stubborn person, I open the disk enclosure (voiding the warranty, but whatever) and discover that the disk inside is a 3900 rpm unit: a rugged, low-speed disk, which for a 2.5″ laptop drive is great, because it reduces the chance of damage from bumps and falls while spinning, but it also reduces actual performance when copying.
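Before opening the box, by the way, a quick measurement can tell you what a disk actually sustains. Here is a rough sequential-write test in Python (the path and sizes are just examples; it writes and then deletes a 1 GB temporary file):

import os
import time

def write_speed(path, chunk_mb=64, total_mb=1024):
    # Write total_mb of random data in chunk_mb pieces and time it.
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(total_mb // chunk_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # force the data to the disk, not just the OS cache
    elapsed = time.perf_counter() - start
    os.remove(path)
    return total_mb / elapsed  # MB/s

print(f"{write_speed('/Volumes/Backup1/_bench.tmp'):.1f} MB/s")  # example path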

Now, in most cases single mechanical disks don’t have the throughput to saturate SATA or USB 3.0 bandwidth, but with a RAID, where the disks’ performance adds up, you might even get there. The average user never runs into this kind of problem, nor particularly notices the differences.

On the other hand, those who handle a lot of data professionally (data backups, movie backups, etc.) have to take several technical factors into account, not in isolation but combined: a fast disk with little cache can be outperformed by a slightly slower disk with more cache, and disk capacity affects performance too, because denser platters at the same RPM deliver more data per rotation, so speed tends to rise with size.

The incompatibilities that didn’t exist on USB 2.0

In a market where everyone is competing to offer the lowest-priced USB 3.0 product, it feels like paradise, but…
Not everyone knows that there are more or less serious incompatibilities between the chipsets of motherboard controllers and those of external enclosures/NAS/disks.

After a series of problems with different motherboards dropping different disks, I did a bit of research and found out that the chipset manufacturers are passing the buck among themselves over responsibility for drives disconnecting and/or communication problems between chipsets. There are hundreds of threads in computer forums pointing out that the riskiest pairings are between these chipsets:

– JMICRON JMS539 + NEC/RENESAS D720200
– JMICRON JMS551 + NEC/RENESAS D720200
– JMICRON JMS539 + ETRON EJ168A
– JMICRON JMS551 + ETRON EJ168A

When you combine these chipsets the risk, or rather the certainty, given how consistently the behavior repeats, is that the connected device will suffer slowdowns and disconnect every 10-15 minutes.

The palliative is to keep the drivers for both chipsets up to date and to disable any power saving on the drives and on the system for those chipsets. There are firmware updates on the chipset manufacturers’ sites, which you can hope will reduce the problems.

Why is it important to know which chipset we are using?

Because depending on the products, we can have multiple chipsets in the same machine. For example, the Gigabyte board I used before had two different chipsets, and with an add-on card I introduced a third chipset that was not among the incriminated ones. My current Asus board has three different USB 3.0 chipsets, so I have to be careful which USB 3.0 port I use for external hard drives: two of them have the incriminated chipsets, so if I connect the WD mini drive, which doesn’t have a problematic controller, it’s fine, but for the NAS units (I have three, two 4-disk and one 8-disk) I have to use the third set of USB 3.0 ports. Those are external ports, though, so I bought an adapter bracket to bring them around back within reach of the connectors; otherwise the disks in the NAS disconnect every 15 minutes.

So it can be concluded that:

the USB 3.0 standard allows us to copy data at UP TO 640 MB/s, as long as we copy data from a disk connected to a chipset DIFFERENT from the receiving one.

What can I do to optimize data transfer?

  • use disks connected to different chipsets and standards, such as internal SATA disks to or from fast USB 3.0 disks
  • use external USB 3.0 disks on two different chipsets, so that no single chipset handling both data input and data output suffers some kind of slowdown
  • disable any kind of power saving on internal and external disks
  • disable antivirus checking on incoming data from disk X (only for safe data)
  • use software that is optimized for data copying and performs a verification pass (checksums or parity) to be certain the data was copied intact, as in the sketch below.
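As to that last point, the essence of a verified copy is simple: hash the source, copy, hash the destination, compare. A minimal Python sketch (paths are examples; dedicated tools are faster and handle errors more gracefully):

import hashlib
import shutil

def sha256(path, block=1024 * 1024):
    # Hash the file in 1 MB blocks to keep memory use constant.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(block):
            h.update(chunk)
    return h.hexdigest()

def verified_copy(src, dst):
    shutil.copy2(src, dst)
    if sha256(src) != sha256(dst):
        raise IOError(f"verification failed: {dst}")

verified_copy("footage/clip001.mov", "/Volumes/Backup1/clip001.mov")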

