Microsoft’s phi-1.5 Challenges LLMs’ Scaling Law, Showcases the Crucial Role of a ‘Textbook Quality’ Dataset | Synced
Source: Synced | AI Technology & Industry Review
A Microsoft research team introduces phi-1.5, a 1.3-billion-parameter model trained on a dataset of 30 billion tokens, which remarkably delivers performance rivaling models five times its size. Moreover, it outperforms most non-frontier LLMs on intricate reasoning tasks.