This paper presents a detailed analysis of the absorption coefficient calculation for multi-level uncooled infrared detectors. The analysis considers a two-level 25 µm pixel pitch infrared detector with a sandwich-type resistor, where the pixel is divided into sub-regions consisting of different stacks of layers. The absorption coefficients of these sub-regions, including the main body, the arms, and the regions where the resistors are implemented, are calculated individually using the cascaded transmission line model. The total absorption coefficient of the detector is then obtained as the area-weighted average of these individual absorption coefficients. The absorption can be calculated as a function of the sacrificial and structural layer thicknesses together with the sheet resistance of the absorber layer to find the optimum values. However, the thermal conductance of the detector must be considered while adjusting the structural layer thickness; the proposed analysis therefore takes the thermal conductance into account so as not to compromise overall detector performance. The analysis shows that a maximum absorption coefficient of 0.92 can be obtained for a specific two-level pixel at a wavelength of 10 µm, while thermal simulations of the same pixel yield a time constant of 11.3 ms with a thermal conductance of 27.2 nW/K. The absorption coefficient of the pixel is maximized when the sheet resistance of the absorber is 380 Ω/sq, which, as expected, is nearly equal to the free-space impedance.
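The transmission-line treatment of absorption can be illustrated with a minimal single-sheet sketch: a thin resistive absorber suspended a vacuum gap above a perfect reflector, which is the simplest special case of the cascaded model (the paper's actual sub-region stacks involve more layers). The sheet resistance, gap, and wavelength values below are illustrative assumptions, not parameters taken from the paper.

```python
import math

Z0 = 376.73  # free-space impedance in ohms


def absorption(r_sheet, gap_um, wavelength_um):
    """Absorption of a thin resistive sheet a vacuum gap above a perfect
    reflector, via the transmission-line analogy: the short-circuited gap
    transforms to an input impedance j*Z0*tan(beta*d), which appears in
    parallel with the sheet resistance at the illuminated interface."""
    beta = 2 * math.pi / wavelength_um            # free-space propagation constant
    z_gap = 1j * Z0 * math.tan(beta * gap_um)     # shorted stub seen through the gap
    z_in = (r_sheet * z_gap) / (r_sheet + z_gap)  # sheet in parallel with the stub
    gamma = (z_in - Z0) / (z_in + Z0)             # reflection coefficient at the input
    return 1 - abs(gamma) ** 2                    # reflector blocks all transmission


# Quarter-wave gap (2.5 um at a 10 um wavelength) with a near-matched sheet:
print(absorption(377.0, 2.5, 10.0))
```

With the gap tuned to a quarter wavelength the stub impedance diverges, the input impedance reduces to the sheet resistance alone, and absorption approaches unity when that resistance matches the free-space impedance, which is consistent with the matching behavior reported in the abstract.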
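The area-weighted averaging step described above can be sketched as follows; the sub-region names, areas, and per-region absorption values are hypothetical placeholders chosen only to show the bookkeeping, not results from the paper.

```python
# Hypothetical sub-regions of a pixel, each with an area and an absorption
# coefficient computed separately (e.g. by a transmission-line model).
subregions = {
    "body":     {"area_um2": 350.0, "absorption": 0.95},
    "arms":     {"area_um2": 120.0, "absorption": 0.60},
    "resistor": {"area_um2": 155.0, "absorption": 0.90},
}

# Total absorption = sum(area_i * eta_i) / sum(area_i), i.e. each sub-region
# contributes in proportion to the pixel area it occupies.
total_area = sum(r["area_um2"] for r in subregions.values())
eta_total = sum(
    r["area_um2"] * r["absorption"] for r in subregions.values()
) / total_area

print(f"Total absorption coefficient: {eta_total:.3f}")
```

The same weighting generalizes to any partition of the pixel, as long as the sub-region areas cover the pixel without overlap.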