The applicability of digital holography to a specific task is often defined by the resolution and the extent of the area that can be measured. The size and resolution of a hologram have a direct impact on the maximum surface angle and the maximum size of the area that can be detected on an object, as well as on the minimum feature size that can be identified. All of these limitations can be expressed in terms of the space bandwidth product (SBP). In standard digital holography the SBP is directly dependent on the number of pixels that the employed CCD or CMOS sensor provides. The SBP of a recorded hologram can be increased using a number of approaches, which generally fall into two categories. Firstly, by recording several holograms that are far apart, the size of the measured hologram can be increased. This approach is known from radar technology and is referred to as the synthetic aperture technique, as the hologram aperture is artificially enlarged. Secondly, by combining several low-resolution holograms with small shifts between them, the resolution can be increased. This technique is known from image processing, where it is referred to as super resolution. The current techniques for super resolution in digital holography are generally based on approaches that were developed for the processing of photographs and similar real-valued images. All of these approaches share the same shortcomings: firstly, they require highly accurate positioning of the sensor at different lateral positions in space; secondly, they cannot tolerate instabilities of the setup that lead to unintended and unknown phase shifts during the measurement. These shortcomings impose additional requirements on the stability of the setup and call for costly positioning elements, which counteracts the original benefit of resolution enhancement: being able to measure a high SBP with an inexpensive setup using low-resolution sensors.
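The core idea behind super resolution can be illustrated with a minimal 1-D sketch: several low-resolution samplings of the same signal, taken with known sub-pixel lateral shifts, are interleaved onto a finer grid. This is only a toy illustration under idealized assumptions (point sampling, exactly known shifts); it is not the reconstruction method of this work, which addresses unknown shifts and finite pixel size.

```python
import numpy as np

# Toy 1-D illustration of the super-resolution principle (not the
# method of this work): interleave several laterally shifted
# low-resolution samplings to recover a signal on a finer grid.

rng = np.random.default_rng(0)
factor = 4                   # resolution gain
signal = rng.random(64)      # stand-in for a high-resolution intensity

# Record `factor` low-resolution measurements, each shifted by one
# fine-grid sample before decimation (idealized point sampling).
measurements = [signal[s::factor] for s in range(factor)]

# Place each shifted low-resolution measurement back on the fine grid.
recovered = np.empty_like(signal)
for s, m in enumerate(measurements):
    recovered[s::factor] = m

assert np.allclose(recovered, signal)
```

With real sensors the pixels integrate over a finite area and the shifts are imperfect, which is why the idealized interleaving above must be replaced by a proper inversion of the sampling process.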
In this work it is shown that a wavefield can be retrieved from multiple measurements with unknown lateral shifts and unknown phase shifts of the object wave. An approach is derived that allows the recovery of the lateral shifts and phase shifts using only the recorded intensities. It is shown that, in the noise-free case, this new approach converges to the correct solution. Further, an approach is derived that allows the direct calculation of the wavefield from the measured intensities using a formalism that introduces four matrices, calculated from the shape of the pixel, the lateral sensor shifts, the phase shifts of the object wave, and the recorded intensities. Thereby, the relation between the wavefield and the recorded intensities is inverted. Three general factors limiting the achievable SBP are identified, and explicit terms for the SBP limit are derived: the combination of pixel size and noise, the error in the determined lateral shifts, and the number of intensity measurements taken. The experimental realization of the approach shows that the factor most relevant in practice is the number of intensity measurements, which is limited by the decorrelation of the object wave.
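The measurement situation described above can be sketched as a simple forward model: each recorded intensity is an interference of a reference wave with the object wave, where measurement k carries an unknown lateral shift s_k and an unknown phase shift phi_k. The symbols and the concrete wave shapes below are illustrative assumptions, not the formalism of this work.

```python
import numpy as np

# Illustrative forward model (assumed for this sketch): measurement k
# records I_k = |R + exp(i*phi_k) * O(x - s_k)|^2, with the lateral
# shift s_k and phase shift phi_k both unknown to the reconstruction.

rng = np.random.default_rng(1)
n = 256
x = np.arange(n)

# Hypothetical object wave: a tilted wave under a Gaussian envelope.
obj = np.exp(1j * 2 * np.pi * 0.05 * x) * np.exp(-((x - n / 2) ** 2) / 800.0)
ref = np.ones(n, dtype=complex)          # plane reference wave

def record(shift, phase):
    """Intensity of reference plus shifted, phase-shifted object wave."""
    shifted = np.roll(obj, shift)
    return np.abs(ref + np.exp(1j * phase) * shifted) ** 2

shifts = rng.integers(0, 8, size=5)       # unknown lateral shifts s_k
phases = rng.uniform(0, 2 * np.pi, size=5)  # unknown phase shifts phi_k
intensities = [record(s, p) for s, p in zip(shifts, phases)]
```

Only the real-valued arrays in `intensities` are available to the reconstruction; the task addressed in this work is to recover the shifts, the phase shifts, and finally the complex wavefield from them.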