What happens if you fine-tune an LLM when you have no idea what you're doing?