Virvatulet

Lives in Finland
Joined on Jul 25, 2005

Comments

Total: 131, showing: 1 – 20
In reply to:

Virvatulet: There is one big omission there: for a serious workstation computer, one should really only consider systems built with ECC memory.

With lots of RAM, memory read errors occur quite frequently, caused by various factors such as background radiation. Many people (think they) are lucky and will seldom if ever notice these errors, but they nevertheless corrupt image data and potentially filesystems, which can turn out really badly, and they cause program hangs and system crashes.

Unfortunately, Intel offers ECC memory support only with its high-end workstation and server processors, the costly Xeons. Luckily, AMD supports ECC memory more broadly; just make sure the motherboard you plan on using supports it too.

@nerd2
A recent study run through Folding@home found that "two-thirds of tested GPUs exhibit a detectable, pattern-sensitive rate of memory soft errors", i.e., memory randomly corrupted without physical damage.

http://ieeexplore.ieee.org/document/5951924/

Other aspects of so-called "distributed computing" like F@h, as compared to true supercomputers, are beyond the scope of this discussion; the two are quite different.

Link | Posted on Feb 1, 2018 at 12:21 UTC
In reply to:

Virvatulet: There is one big omission there: for a serious workstation computer, one should really only consider systems built with ECC memory.

With lots of RAM, memory read errors occur quite frequently, caused by various factors such as background radiation. Many people (think they) are lucky and will seldom if ever notice these errors, but they nevertheless corrupt image data and potentially filesystems, which can turn out really badly, and they cause program hangs and system crashes.

Unfortunately, Intel offers ECC memory support only with its high-end workstation and server processors, the costly Xeons. Luckily, AMD supports ECC memory more broadly; just make sure the motherboard you plan on using supports it too.

The problem at Virginia Tech's Advanced Computing facility was solved some months later with a newer Apple machine, the Xserve G5, which featured ECC memory.

The system, named System X and of supercomputer class at the time, ran fine for years and was decommissioned on May 21, 2012.

Link | Posted on Jan 31, 2018 at 01:59 UTC
In reply to:

Virvatulet: There is one big omission there: for a serious workstation computer, one should really only consider systems built with ECC memory.

With lots of RAM, memory read errors occur quite frequently, caused by various factors such as background radiation. Many people (think they) are lucky and will seldom if ever notice these errors, but they nevertheless corrupt image data and potentially filesystems, which can turn out really badly, and they cause program hangs and system crashes.

Unfortunately, Intel offers ECC memory support only with its high-end workstation and server processors, the costly Xeons. Luckily, AMD supports ECC memory more broadly; just make sure the motherboard you plan on using supports it too.

Haven't checked it, but I bet that is RAM driven outside the OEM silicon specs, i.e., overclocked. When reliability is a concern, one doesn't overclock RAM; not without ECC, in any case.

Link | Posted on Jan 30, 2018 at 14:01 UTC
On article Canon patents fingerprint reader for cameras and lenses (191 comments in total)
In reply to:

Nikoncanonfan: This is something I would turn off straight away; I wouldn't want it, and it doesn't stop your gear from being stolen.

Technically speaking, the security feature could be partially integrated into the image sensor itself, and good luck circumventing it there; I just want to point out that the protection mechanism can be made robust. Not that I would deem Canon's suggested approach good or even viable.

Link | Posted on Jan 30, 2018 at 10:40 UTC
On article Canon patents fingerprint reader for cameras and lenses (191 comments in total)
In reply to:

Tommi K1: Terrible idea.

Fingerprint scanners are not even 80% reliable at detecting fingerprints, and they are dramatically less secure against being fooled by fake fingerprints.

You also can't use a fingerprint reader with gloves, in dirty environments (weather-sealed cameras), in wet conditions (water on the sensor) and so on.

So all that system does is cause trouble. A human user interface needs to be 100% reliable, so that every input is recognized correctly; even 99-out-of-100 reliability can be frustrating, depending on what the input does.

Oh well... Someone could have done a simple PIN code that is required every day, such that after 5 wrong entries of your own PIN you can't unlock the camera without entering a manufacturer PIN (like a PUK code) found on your warranty card or similar.

Think about it... No way to flash firmware or reset the camera to reset that PUK code. Then the only way would be to swap some logic board inside, which would be too expensive for thieves.

I couldn't agree more.

In addition to the protection methods you suggested, one could use an NFC, Wi-Fi or Bluetooth connection (now that many cameras have those) and an external key dongle to enable the camera; a smartphone key app would be another possibility.

I'm sure that many current cameras could be made lockable and protected this way with just a firmware update.
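
To make the idea concrete, here is a minimal sketch of the kind of HMAC challenge-response handshake a camera could run against a key dongle or phone app. It's Python for readability, and every name in it is hypothetical; real firmware would do this in C against a secure key store.

    # Hypothetical camera-unlock handshake over NFC/Bluetooth (illustration only).
    import hmac
    import hashlib
    import os

    SHARED_KEY = os.urandom(32)  # provisioned into camera and dongle when pairing

    def camera_make_challenge() -> bytes:
        """Camera sends a fresh random nonce so responses can't be replayed."""
        return os.urandom(16)

    def dongle_respond(challenge: bytes, key: bytes) -> bytes:
        """Dongle proves knowledge of the shared key without revealing it."""
        return hmac.new(key, challenge, hashlib.sha256).digest()

    def camera_verify(challenge: bytes, response: bytes, key: bytes) -> bool:
        expected = hmac.new(key, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)  # constant-time compare

    challenge = camera_make_challenge()
    response = dongle_respond(challenge, SHARED_KEY)
    print("camera unlocked:", camera_verify(challenge, response, SHARED_KEY))

Because the camera issues a fresh random challenge every time, recording the radio traffic doesn't let a thief replay an old unlock.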

Link | Posted on Jan 30, 2018 at 00:34 UTC
In reply to:

Virvatulet: There is one big omission there: for a serious workstation computer, one should really only consider systems built with ECC memory.

With lots of RAM, memory read errors occur quite frequently, caused by various factors such as background radiation. Many people (think they) are lucky and will seldom if ever notice these errors, but they nevertheless corrupt image data and potentially filesystems, which can turn out really badly, and they cause program hangs and system crashes.

Unfortunately, Intel offers ECC memory support only with its high-end workstation and server processors, the costly Xeons. Luckily, AMD supports ECC memory more broadly; just make sure the motherboard you plan on using supports it too.

"An even more dramatic example of cosmic-radiation interference happened at Virginia Tech’s Advanced Computing facility in Blacksburg. In the summer of 2003, Virginia Tech researchers built a large supercomputer out of 1,100 Apple Power Mac G5 computers. They called it Big Mac. To their dismay, they found that the failure rate was so high it was nearly impossible even to boot the whole system before it would crash.

The problem was that the Power Mac G5 did not have error-correcting code (ECC) memory, and cosmic ray–induced particles were changing so many values in memory that out of the 1,100 Mac G5 computers, one was always crashing."

Excerpt from the IEEE magazine Spectrum:
http://spectrum.ieee.org/computing/hardware/how-to-kill-a-supercomputer-dirty-power-cosmic-rays-and-bad-solder

Link | Posted on Jan 29, 2018 at 21:16 UTC
In reply to:

Virvatulet: There is one big omission there: for a serious workstation computer, one should really only consider systems built with ECC memory.

With lots of RAM, memory read errors occur quite frequently, caused by various factors such as background radiation. Many people (think they) are lucky and will seldom if ever notice these errors, but they nevertheless corrupt image data and potentially filesystems, which can turn out really badly, and they cause program hangs and system crashes.

Unfortunately, Intel offers ECC memory support only with its high-end workstation and server processors, the costly Xeons. Luckily, AMD supports ECC memory more broadly; just make sure the motherboard you plan on using supports it too.

@nerd2 A memory subsystem using ECC is exactly as fast as it is specified to be, and it doesn't have to be meaningfully slower: the correction is pure hardware logic. As a "nerd2" you likely know that ECC structures have also been used in high-end CPUs' internal cache memory, and there isn't any faster memory available than that. BTW, I hope you did read the linked study paper.
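
For the curious, here's a toy sketch of the single-error-correcting Hamming logic involved. It's Python and illustrative only; real DRAM ECC implements a wider SECDED code (typically 72 stored bits per 64 data bits) entirely in memory-controller hardware.

    # Toy Hamming code over one byte: 8 data bits + 4 parity bits (at the
    # power-of-two positions, 1-indexed). Corrects any single flipped bit.

    def encode(data_bits):
        n = 12
        code = [0] * (n + 1)  # index 0 unused
        data_positions = [i for i in range(1, n + 1) if i & (i - 1) != 0]
        for pos, bit in zip(data_positions, data_bits):
            code[pos] = bit
        for p in (1, 2, 4, 8):  # parity bit p covers positions with bit p set
            parity = 0
            for i in range(1, n + 1):
                if i != p and (i & p):
                    parity ^= code[i]
            code[p] = parity
        return code[1:]

    def correct(codeword):
        code = [0] + list(codeword)
        syndrome = 0
        for i in range(1, len(code)):
            if code[i]:
                syndrome ^= i  # XOR of the positions of all set bits
        if syndrome:
            code[syndrome] ^= 1  # the syndrome points at the flipped bit
        return code[1:], syndrome

    word = [1, 0, 1, 1, 0, 0, 1, 0]
    cw = encode(word)
    cw[5] ^= 1  # simulate a cosmic-ray bit flip (position 6, 1-indexed)
    fixed, pos = correct(cw)
    print("flipped position:", pos, "- recovered:", fixed == encode(word))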

Link | Posted on Jan 29, 2018 at 20:35 UTC
In reply to:

Virvatulet: There is one big omission there: for a serious workstation computer, one should really only consider systems built with ECC memory.

With lots of RAM, memory read errors occur quite frequently, caused by various factors such as background radiation. Many people (think they) are lucky and will seldom if ever notice these errors, but they nevertheless corrupt image data and potentially filesystems, which can turn out really badly, and they cause program hangs and system crashes.

Unfortunately, Intel offers ECC memory support only with its high-end workstation and server processors, the costly Xeons. Luckily, AMD supports ECC memory more broadly; just make sure the motherboard you plan on using supports it too.

There probably is newer material available by now, but this study might shed some light on the subject:

http://www.cs.toronto.edu/~bianca/papers/sigmetrics09.pdf

The fact is that RAM read errors occur all the time; it's just a lottery as to their consequences. The more memory there is, and the denser the VLSI devices, the higher the probability. I acknowledge that the ten-year-old study referred to above suggests that newer memory-IC generations don't increase the relative probability of memory errors, but I suspect that this no longer holds for current DDR3 and newer devices.
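
To put a rough number on "all the time": the study linked above reported on the order of 25,000-70,000 errors per billion device-hours per Mbit (the FIT metric). Rates vary hugely between machines and are dominated by a few bad DIMMs, so the following back-of-envelope is only an order-of-magnitude illustration:

    # Expected memory errors for a modest workstation, using the low end of
    # the FIT-per-Mbit range reported in the sigmetrics09 study linked above.
    FIT_PER_MBIT = 25_000          # errors per 10^9 device-hours per Mbit
    RAM_GB = 16
    HOURS_PER_YEAR = 24 * 365

    mbits = RAM_GB * 1024 * 8      # 16 GB = 131072 Mbit
    errors_per_hour = mbits * FIT_PER_MBIT / 1e9
    print(f"about {errors_per_hour:.1f} errors/hour,")
    print(f"roughly {errors_per_hour * HOURS_PER_YEAR:,.0f} per year without ECC")

That comes out to a few errors per hour on average; ECC turns nearly all of them into non-events.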

Interestingly, ECC memory has been used even in professional mid- and high-end display adapters for quite some time now, GPGPU high-performance computing of course being a major driver here. Both AMD and Nvidia offer several models with ECC support.

Link | Posted on Jan 28, 2018 at 00:44 UTC

There is one big omission there: for a serious workstation computer, one should really only consider systems built with ECC memory.

With lots of RAM, memory read errors occur quite frequently, caused by various factors such as background radiation. Many people (think they) are lucky and will seldom if ever notice these errors, but they nevertheless corrupt image data and potentially filesystems, which can turn out really badly, and they cause program hangs and system crashes.

Unfortunately, Intel offers ECC memory support only with its high-end workstation and server processors, the costly Xeons. Luckily, AMD supports ECC memory more broadly; just make sure the motherboard you plan on using supports it too.

Link | Posted on Jan 26, 2018 at 23:59 UTC as 15th comment | 13 replies
In reply to:

noyo: I don't see any mention of whether the screen is shiny or non-shiny. It could be that I've missed it, but if it is not non-glare/anti-reflective, I'm not interested at any spec or price. Anyone know?

Actually, Philips withdrew from that move (due to the intended buyer's financials) and returned to the business with a new partner, namely TP-Vision.

Link | Posted on Dec 5, 2017 at 10:54 UTC
In reply to:

minababe: I don't see anything wrong with this. If a commercial photographer wants to exploit a place for profit, the place should be allowed some sort of compensation in return.

How about a production team visiting the place, probably more than once, using the area's hotels, restaurants, transportation, local gear rental and guide/translator services, possibly hiring locals as assistants; and, last but not least, giving the place free visibility and media coverage?

That is reasonable compensation, worth much more than a fee of a few thousand euros, but obviously harder to quantify. I believe this kind of milking of your best friends, that is, visitors a.k.a. customers, is short-sighted and will prove to be a counter-productive practice.

Link | Posted on Nov 4, 2017 at 00:35 UTC
In reply to:

Virvatulet: Sorry to be a bit critical, but DJI's DL mount has a problematic construction, namely the location of the electrical connection pins. Embedding the lens contacts in the metal lens-mount ring will introduce reliability issues.

Even with a professional, holistic approach to using and caring for the equipment, there will eventually be a build-up of metal-doped dust, and even chips, inside those contact pits from repeatedly attaching bayonet-mounted lenses. Electrically conductive dirt can lead to short circuits and impedance imbalance, the latter being relevant to high-speed in-circuit data links.

I know that this type of design has been used before, Pentax comes to mind, but I would nevertheless have liked to see safer design decisions in these new developments.

There is plenty of room for alternative arrangements of the connector, even within these dimensions. I would say the chosen construction reflects design inexperience, a style-over-function attitude and maybe, just maybe, manufacturability considerations.

Link | Posted on Oct 14, 2017 at 22:29 UTC

Sorry to be a bit critical, but DJI's DL mount has a problematic construction, namely the location of the electrical connection pins. Embedding the lens contacts in the metal lens-mount ring will introduce reliability issues.

Even with a professional, holistic approach to using and caring for the equipment, there will eventually be a build-up of metal-doped dust, and even chips, inside those contact pits from repeatedly attaching bayonet-mounted lenses. Electrically conductive dirt can lead to short circuits and impedance imbalance, the latter being relevant to high-speed in-circuit data links.

I know that this type of design has been used before, Pentax comes to mind, but I would nevertheless have liked to see safer design decisions in these new developments.

Link | Posted on Oct 14, 2017 at 13:50 UTC as 16th comment | 4 replies
On article Video: Removing a stuck lens filter... with a band saw (139 comments in total)
In reply to:

Stacey_K: One more reason to not use a "protective" filter. I'm sure all that torque on the lens didn't hurt anything.

I'm a pro-UV/protective-filter guy and I use them extensively myself. I have never had a problem with persistently stuck filters; hard to remove sometimes, yes, but never permanently attached.

However, your point about the torque on the lens (its internal mechanism) is well founded: when removing or attaching a filter, one should always support the lens by the front part (usually the lens-hood bayonet) into which the filter is screwed.

Optimally, during the process the lens or camera combo should be supported in a way that allows it to move easily should any residual force be transmitted to it. For example, using a neck strap or placing the camera on a soft surface like a bed or a furniture cushion works well (note: a loose pillow can be unstable, so be careful).

Link | Posted on Jun 28, 2017 at 23:24 UTC
In reply to:

Tom Holly: The image quality of a compact with all the convenience of a DSLR

The optical formula of the lens may limit the achievable stabilization effect. Tamron is not necessarily skimping, just making a reasonable compromise. That said, I was a bit puzzled by the VC performance too.

Link | Posted on Jun 23, 2017 at 09:58 UTC
On article Fujifilm announces development of EF-X500 flash (78 comments in total)
In reply to:

arhmatic: Probably a beginner question regarding wireless - does this mean that the flash can be located anywhere in the room, and it can be triggered with the camera, within a certain range? Does the camera need to be wireless capable?

What one needs for wireless remote control is any compatible flash (compatible here meaning a Fujifilm-system flash) that can operate in master mode, or a pure remote-controller unit without flash functionality.

This Fujifilm flash has both master and remote modes selectable, so one would need, for example, at least two of them: one flash unit attached to the camera working as the master and the other as a slave in remote mode.

BTW, I don't know how Fujifilm has implemented it, and I have not explicitly tested it with my Canon flashes, but my real-life experience is that the master flash doesn't light up the subject, because there is a small intentional delay after the master's optical communication pulse train has finished transferring the remote commands. I should test this specifically to be certain.
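
As a toy illustration of why the master's command pulses needn't register in the exposure, consider this hypothetical timeline; all the timing constants are made up, as I don't know the manufacturers' actual figures.

    # Hypothetical timeline of optical wireless flash triggering (illustration).
    PULSE_TRAIN_MS = 4.0  # assumed duration of the master's optical command burst
    GUARD_DELAY_MS = 2.0  # assumed intentional gap before the shutter opens
    EXPOSURE_MS = 8.0     # e.g. roughly a 1/125 s exposure

    events = [
        ("master command pulse train", 0.0, PULSE_TRAIN_MS),
        ("guard delay (no light)", PULSE_TRAIN_MS, PULSE_TRAIN_MS + GUARD_DELAY_MS),
        ("shutter open + slave flash", PULSE_TRAIN_MS + GUARD_DELAY_MS,
         PULSE_TRAIN_MS + GUARD_DELAY_MS + EXPOSURE_MS),
    ]
    for name, start, end in events:
        print(f"{start:5.1f}-{end:5.1f} ms  {name}")

Since the command burst ends before the shutter opens, none of the master's communication light lands on the frame.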

Link | Posted on Jan 16, 2016 at 00:20 UTC
On article Fujifilm announces development of EF-X500 flash (78 comments in total)
In reply to:

Digimat: long overdue.

but why the hell do all on-camera flashes have to look like they came to us in a DeLorean?! i mean it's not the 80s anymore... a bit more compactness, quality and design isn't a bad thing... it's 2016 after all.

and why is there no radio communication? even those cheapo china things can do it pretty reliably...

To me this Fuji flash looks like a tool with a pleasing, stylish, utilitarian form language. By the way, the compactness comes with limitations, particularly in the power and recycle-time department and in heat management. One can't have it all, I'm afraid.

Link | Posted on Jan 15, 2016 at 23:07 UTC
On article Fujifilm announces development of EF-X500 flash (78 comments in total)
In reply to:

arhmatic: Probably a beginner question regarding wireless - does this mean that the flash can be located anywhere in the room, and it can be triggered with the camera, within a certain range? Does the camera need to be wireless capable?

Well, actually, optical remote control of flashes can, and often will, work reliably through reflected light too, particularly when operating in a relatively small, confined space like a room in an apartment.

If optical communication is used outdoors, or there are other very strong light sources such as sunlight coming through the windows, then a direct optical path from the master flash to the slaves is required for reliability.

I have used optical remote control (Canon) successfully even in a church, so I wouldn't necessarily deem this a disadvantage, especially since optical communication is free from regional regulatory problems, and your neighbours' WLAN, microwave oven, etc. won't ruin your day.

Link | Posted on Jan 15, 2016 at 22:57 UTC
In reply to:

johnsmith404: This might be the future of photography. Taking multiple 'sequential' exposures and extracting all kinds of information is the way the human brain does its vision job. Related mechanisms lead to high resolution (hyperacuity) and low noise.

I wonder if I'll live long enough to see the first neuromorphic processing engines in cameras.

@ newe

...of you.

Link | Posted on Aug 6, 2015 at 03:34 UTC
In reply to:

AbrasiveReducer: Some impressive insults here. Spin this as you like; the idea is that if the person taking the picture manages to make money from it, somebody else wants that money. You see, the folks who own skyscrapers are having trouble making ends meet but you, the photographer, can help.

Just as stock photos create a revenue stream, helping Gates and Getty put food on the table.

Indeed, it is quite a "choice" to be required to point the camera away from one's desired direction when operating in public places.

One could formally argue that such places are no longer truly public, and I know that the right to use public space is a stronger principle of justice than copyright, or at least I thought I knew. This copyright issue has spun out of control because of incompetent and influenced decision-making.

Link | Posted on Jul 9, 2015 at 01:28 UTC