Google, Bing, and other search engines were never designed to rely on manual site submissions. That is why they crawl the web.
For those who aren’t familiar with web crawling: it is simply the process by which search engines look for new links on websites and “follow” them. If a new link leads to something valuable, the page is indexed.
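To make the “follow the links” step concrete, here is a minimal sketch in Python. It shows only the link-extraction part of crawling, using a small hypothetical snippet of HTML as input; a real crawler would first fetch each page over the network, then extract its links and queue any unseen ones for a later visit.

```python
# Minimal sketch of the link-discovery step of crawling. The HTML
# content and URLs below are hypothetical examples, not real pages.
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect the href target of every <a> tag on a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page URL.
                    self.links.append(urljoin(self.base_url, value))

# Hypothetical page content:
html = '<a href="/about">About</a> <a href="https://example.org/news">News</a>'
parser = LinkExtractor("https://example.com/")
parser.feed(html)
print(parser.links)
# → ['https://example.com/about', 'https://example.org/news']
```

Each discovered URL would then be fetched and parsed the same way, which is how a crawler keeps finding pages it has never been told about.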
It has also been theorized that Google draws on other sources of data, such as browser usage statistics and domain registration records, to aid its endless hunt for new webpages and websites.
In short, search engines can discover new webpages on their own, as long as those pages are linked somewhere on the web.
The truth is, there is a good chance search engines will find your website whether or not you submit it manually. But will “probably” be good enough?