The primary purpose of the robots.txt file on your website is to tell search engines and crawlers what NOT to crawl (and, by extension, what to keep out of their index).
One common mistake is not filling it in correctly; an empty or malformed file can cause Google to treat it as invalid.
This usually happens when you think "I don't want to block anything on this site, so I won't put anything in the file."
My advice when creating a robots.txt file is to add a rule that blocks an imaginary folder or file, so the file always has at least one valid directive (replacing mywebsiteURL with whatever your site is).
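A minimal sketch of what that could look like (the folder name here is just a placeholder, and the Sitemap line is optional; swap in your own domain for mywebsiteURL):

```
# Allow all crawlers, but include one rule so the file is never empty
User-agent: *
Disallow: /imaginary-folder/

# Optional: point crawlers at your sitemap
Sitemap: https://mywebsiteURL.com/sitemap.xml
```

Since no real URL on your site starts with /imaginary-folder/, this blocks nothing in practice while keeping the file syntactically valid.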
If you have things you actually need to block, then list those instead; just don't leave the file blank.