Machine learning has made remarkable progress in recent years, with models matching human capabilities across a wide range of tasks. The real difficulty, however, lies not in building these models but in deploying them effectively in practice. This is where AI inference comes into play, emerging as a critical focus for experts and tech