Moire is what's known as an "aliasing" error. Basically, it means
that the signal being captured contained high frequency content
that could not be accurately captured at the sample rate in
question. Frequencies above that limit don't simply disappear;
they fold back into the range that COULD be captured, masquerading
as lower-frequency detail. That false detail is what produces the
effect.
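A one-dimensional sketch of the folding, with a sample rate and
frequencies chosen purely for illustration: a sine above the
Nyquist limit (half the sample rate) produces exactly the same
samples as a lower-frequency impostor.

```python
import math

fs = 10.0              # sample rate (Hz); Nyquist limit is fs / 2 = 5 Hz
f_high = 9.0           # signal frequency, above the Nyquist limit
f_alias = fs - f_high  # frequency the samples actually "look like" (1 Hz)

high  = [math.sin(2 * math.pi * f_high  * k / fs) for k in range(20)]
alias = [-math.sin(2 * math.pi * f_alias * k / fs) for k in range(20)]

# The two sample sequences are identical: once sampled, the 9 Hz
# signal is indistinguishable from a (negated) 1 Hz signal. That
# false low-frequency content is the 1-D analogue of a moire pattern.
assert all(abs(a - b) < 1e-9 for a, b in zip(high, alias))
```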
In an optical system, it means there was fine detail in the image
generated at the focal plane which could not be properly resolved
by the image sensor.
This is purely a question of sensor resolution. It is NOT
dependent on what kind of image sensor is used. However, the exact
appearance of the error does depend on the type of sensor being
used. Bayer pattern sensors arguably produce the most
objectionable results, because the color filter array samples each
color channel at a different set of positions: they generate false
colors in addition to creating wacky "random" patterns in the
image.
Strictly speaking, the "correct" way to make this error go away is
to increase the sensor resolution. The more detail the image
sensor captures, the finer the detail in the image has to be
before aliasing errors occur.
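Continuing the same 1-D analogy (frequencies again chosen just for
illustration): doubling the sample rate moves the Nyquist limit
from 5 Hz to 10 Hz, so the 9 Hz component that aliased before is
now captured faithfully.

```python
import math

def samples(f, fs, n=20):
    """Sample a sine of frequency f (Hz) at sample rate fs (Hz)."""
    return [math.sin(2 * math.pi * f * k / fs) for k in range(n)]

def gap(f1, f2, fs):
    """How far the sampled f1 sequence is from a negated f2 sequence."""
    return max(abs(a + b) for a, b in zip(samples(f1, fs), samples(f2, fs)))

# At fs = 10 Hz, a 9 Hz sine collides with a (negated) 1 Hz sine.
assert gap(9, 1, fs=10) < 1e-9

# At fs = 20 Hz, 9 Hz sits below the new Nyquist limit of 10 Hz,
# and the two signals are clearly distinguishable again.
assert gap(9, 1, fs=20) > 0.5
```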
Alternatively, you can tackle the problem by filtering the
image. Since the problem is high frequency content that cannot be
recorded, the solution is to use a low-pass filter to remove that
portion of the signal before it's captured.
This is typically known as an "anti-aliasing" filter. In a digital
camera system, it is a mild optical diffusion filter that's
actually placed over the image sensor. This diffuses ultra-fine
detail before it reaches the sensor, removing the high frequency
content from the signal.
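A toy version of that idea, with simple block averaging standing
in for the optical diffusion (purely illustrative): low-pass
filtering before decimation discards the unrepresentable detail
instead of letting it alias.

```python
def decimate_naive(x, factor):
    # Keep every factor-th sample; high frequencies alias.
    return x[::factor]

def decimate_filtered(x, factor):
    # Average each block of `factor` samples first -- a crude
    # low-pass filter playing the role of the AA filter.
    return [sum(x[i:i + factor]) / factor
            for i in range(0, len(x) - factor + 1, factor)]

# The finest detail this signal can hold: alternating 1, 0, 1, 0...
signal = [1.0, 0.0] * 8

print(decimate_naive(signal, 2))     # all 1.0 -- the detail aliases
                                     # into a bogus constant level
print(decimate_filtered(signal, 2))  # all 0.5 -- the true average,
                                     # unresolvable detail removed
```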
This works fairly well, but some people don't like this solution
because it can sometimes soften the image a little too much.
"Camera makers who strive to get better resolving power are
ultimately restricted by moire and similar problems."
No, they're not. Increasing sensor resolution will make moire
patterns go away, not get worse.