Machine learning has advanced considerably in recent years, with models matching human capabilities in a growing range of tasks. However, the main hurdle lies not just in developing these models, but in deploying them effectively in everyday use cases. This is where inference in AI comes into play, emerging as a critical focus for researchers and industry professionals alike.