A sensor on a Nikon 800 resolves about 4,000 lines per inch. That is not exactly the same as saying it has a resolution of 4,000 dpi (dots per inch), but it is pretty close. The final resolution depends on the optics conveying the image to the sensor and on the software making sense of the data harvested by the sensor. It is currently one of the most "resolving" sensors on the market, and it is on a par with the best optics.
Increasing the resolution is tricky, since optics have physical limitations, so scientists usually adopt alternative ways of capturing images that can do without optics altogether. An example is the electron microscope.
Now scientists at the Friedrich Schiller University in Jena, Germany, have found a way to extract image information from diffraction patterns, achieving a resolution 250 times better than the Nikon sensor's: 1 million dots per inch. That means being able to capture features just 26 nanometers in length.
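The figures quoted above can be checked with a little arithmetic, since one inch is 25.4 millimeters:

```python
# Sanity check of the resolution figures quoted in the article.
NM_PER_INCH = 25_400_000      # 1 inch = 25.4 mm = 25,400,000 nm

sensor_dpi = 4_000            # the Nikon sensor, roughly
technique_dpi = 1_000_000     # the diffraction-based technique

print(NM_PER_INCH / technique_dpi)   # nanometers per dot: 25.4, i.e. "about 26 nm"
print(technique_dpi // sensor_dpi)   # improvement factor: 250
```

So a million dots per inch works out to roughly 25 nanometers per dot, matching the 26-nanometer figure.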
They fire extreme-ultraviolet laser light at the object they want to image and collect the diffracted photons. Software then works out the paths the photons followed by analysing the diffraction pattern and generates the image. In practice, the computer simulates the lens.
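This kind of lensless reconstruction is generally done with iterative phase retrieval: the detector records only the intensity of the diffracted light, and an algorithm recovers the missing phase by bouncing between the detector plane and the object plane, enforcing what is known in each. The sketch below is a minimal error-reduction loop of the Gerchberg-Saxton type on synthetic data; it is an illustration of the general technique, not the Jena group's actual code, and the toy "sample" and support mask are invented for the demo.

```python
import numpy as np

def error_reduction(measured_amplitude, support, n_iter=200, seed=0):
    """Minimal error-reduction phase retrieval.

    measured_amplitude: square root of the recorded diffraction intensity
                        (the far-field magnitudes the detector measures).
    support:            boolean mask of where the object may be nonzero.
    """
    rng = np.random.default_rng(seed)
    # Start from the measured amplitudes with a random phase guess.
    phase = np.exp(1j * rng.uniform(0.0, 2 * np.pi, measured_amplitude.shape))
    field = measured_amplitude * phase
    for _ in range(n_iter):
        # Back-propagate to the object plane: this FFT is the "computed lens".
        obj = np.fft.ifft2(field)
        # Object-plane constraints: real, non-negative, confined to the support.
        obj = np.where(support, obj.real.clip(min=0.0), 0.0)
        # Forward-propagate, then reimpose the measured amplitudes.
        field = np.fft.fft2(obj)
        field = measured_amplitude * np.exp(1j * np.angle(field))
    return np.where(support, np.fft.ifft2(field).real, 0.0)

# Demo: recover a synthetic object from its diffraction amplitudes alone.
truth = np.zeros((32, 32))
truth[12:20, 10:22] = 1.0                 # a bright rectangle as the "sample"
amplitude = np.abs(np.fft.fft2(truth))    # what the detector would record
support = truth > 0                       # prior knowledge of the object extent
recovered = error_reduction(amplitude, support)
```

The key point the article makes is visible in the loop: no glass lens ever forms the image; the two FFTs play that role, and the constraints substitute for the phase information the detector cannot record.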
This could be used to spot tiny defects in integrated circuits, or to examine cancer cells to discover possible points of attack on their membranes, or to look at the organelles inside a cell.