I built out the ELK stack at my company. It handles about 400 GB of logs a day. Logging clients connect to an ELK stack in AWS directly over the internet, using log-courier with client/server certificates for authentication. If you're having trouble setting up the backend (Elasticsearch), I'd highly suggest you don't try to manage the cluster yourself. Loggly/Papertrail/Logentries are all cheap enough at this point unless you're pushing a lot of logs.
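For reference, a minimal log-courier client config for the TLS setup described above might look something like this (hostnames, ports, and file paths here are placeholders, not from my actual setup):

```json
{
  "network": {
    "servers": [ "logstash.example.com:5043" ],
    "ssl ca": "/etc/log-courier/ca.crt",
    "ssl certificate": "/etc/log-courier/client.crt",
    "ssl key": "/etc/log-courier/client.key"
  },
  "files": [
    {
      "paths": [ "/var/log/app/*.log" ],
      "fields": { "type": "app" }
    }
  ]
}
```

The "ssl certificate"/"ssl key" pair is what authenticates the client to the server, and "ssl ca" pins the server cert, so connections over the open internet are mutually verified. On the Logstash side you'd pair this with the courier input plugin configured to verify client certs against the same CA.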
The goal of your project is a bit unclear, though. It sounds like you want to ship your logs somewhere locally and also up to a cloud service for longer retention. Do you really need a local ELK stack, or do you just want a collector/aggregator that holds the data only in transit and ships it up to the SaaS? Could you explain a bit more?