I have a Lambda function that takes about 120 ms to run with a 1024 MB memory size. I checked the logs and found that it was only using about 22 MB at peak, so I tried to optimize it by reducing the memory size to 128 MB.
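Here is roughly how I made the change (a minimal boto3 sketch; the function name is a placeholder, not my real function):

```python
import boto3

client = boto3.client("lambda")

# Lower the function's memory allocation from 1024 MB to 128 MB.
# "my-function" is a hypothetical name used for illustration.
client.update_function_configuration(
    FunctionName="my-function",
    MemorySize=128,
)
```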
But when I did that, the execution time went from 120 ms to 350 ms, even though the function was still only using 22 MB of memory. So I am a bit confused: how does memory allocation impact processing time?