I am looking to build an email scraper to respond to apartment rental inquiries from Zillow, Zumper, and apartments.com.
When an inquiry is made, these platforms send a templated email with information in the subject and body of the email. In response to these emails, I would like to automatically reply with a canned email that includes a link to a Google Form. However, I would like to dynamically include information from the inquiry email in the link so it prefills the form.
For example, Zillow’s generated email has the subject “{Name} is requesting information about {Unit}”. The body of the email also includes a phone number in a structured table of data. Zumper and [login to view URL] are similar.
The prefilled link would look like this:
[login to view URL]
To be clear, the process would look like this:
1. Determine whether the incoming email is an inquiry from Zillow, Zumper, or [login to view URL].
2. Scrape the email for the name, phone number, and apartment unit.
3. Respond with a pre-written email that includes a link pre-filled with the name, phone number, and apartment unit.
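As a concrete illustration, the scraping and link-building parts of steps 1–3 could be sketched in Python roughly like this. The form URL and `entry.*` IDs below are placeholders, not real values; the actual ones come from the Google Form's "Get pre-filled link" option. The phone pattern assumes a US-style number in the email body.

```python
import re
from urllib.parse import urlencode

# Placeholder form URL and entry IDs -- substitute the real ones from
# the form's "Get pre-filled link" feature.
FORM_URL = "https://docs.google.com/forms/d/e/FORM_ID/viewform"
ENTRY_IDS = {"name": "entry.111111", "unit": "entry.222222", "phone": "entry.333333"}

# Zillow's subject template: "{Name} is requesting information about {Unit}"
SUBJECT_RE = re.compile(r"^(?P<name>.+) is requesting information about (?P<unit>.+)$")
# US-style phone number, e.g. (555) 123-4567 or 555-123-4567
PHONE_RE = re.compile(r"\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}")

def parse_inquiry(subject, body):
    """Extract name, unit, and phone from a Zillow-style inquiry email."""
    m = SUBJECT_RE.match(subject.strip())
    if not m:
        return None  # step 1: not a recognized inquiry
    phone = PHONE_RE.search(body)
    return {
        "name": m.group("name"),
        "unit": m.group("unit"),
        "phone": phone.group(0) if phone else "",
    }

def prefilled_link(fields):
    """Build the pre-filled Google Form link from the scraped fields."""
    params = {ENTRY_IDS[k]: v for k, v in fields.items() if v}
    return FORM_URL + "?usp=pp_url&" + urlencode(params)
```

Zumper and apartments.com would each get their own subject/body patterns, but the same two functions cover the shape of the task.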
Hello sir,
I am a senior Python developer with 9 years of experience. I have built email scrapers before and am well suited for this task. The requirements are clear to me, and I am ready to start the work.
Best Regards,
Yongtao
Hi.
I did read the project description and have a few questions.
1. Do you need the script as well or data only?
2. What is the format of the output data? CSV is OK? We can do other formats as well.
3. Which fields do you want to extract from the website?
4. What is the website?
5. How many results/urls are there?
Thanks. I'm waiting for these details and hope to collaborate.
Dear Customer! I have built many scrapers; this is my favorite type of job. I am absolutely confident I can complete your project easily and on time, and I am ready to begin immediately and finish as soon as possible. Please contact me to discuss in more detail. Thank you.
Hello Sir,
We are experts in web scraping using Java (jsoup and Selenium), and we have already worked on similar projects reading emails over POP and IMAP and sending automatic replies.
Please ping me; we can start right now.
Thanks
Hi,
First of all, thank you for providing such a detailed project description. I have experience in web scraping using Python, and I will use the Selenium and Beautiful Soup libraries for this task. Please note that I am bidding on the assumption that the email site has no defences to prevent web scraping or automation.
Steps:
1) Opening and then logging into the site using credentials
2) Identify all the unread emails up to the start date provided during bot startup
3) Identify all the emails from the three websites
4) Read each mail and gather data
5) Prepare and send an automated response
6) After all the responses have been sent, refresh the page, then repeat steps 2 to 5. (Note that after a page refresh, the time of the latest observed email will be used instead of the time provided at bot startup.)
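The loop in steps 2–6 could be sketched like this. The `mailbox` object here is a hypothetical adapter around the browser session; its method names (`unread_since`, `reply`, `refresh`) and message attributes are illustrative, not a real API.

```python
import time

SOURCES = ("zillow.com", "zumper.com", "apartments.com")

def run_once(mailbox, last_seen, sources=SOURCES):
    """One pass of steps 2-5: find unread inquiries, scrape each, reply.

    `mailbox` is a hypothetical adapter around the automated mail session.
    Returns the newest timestamp seen, so the next pass (step 6) can use it
    in place of the time provided at bot startup.
    """
    for msg in mailbox.unread_since(last_seen):
        if msg.sender_domain not in sources:
            continue  # step 3: keep only the three inquiry sources
        fields = msg.scrape_fields()   # step 4: name, phone, unit
        mailbox.reply(msg, fields)     # step 5: canned email + prefilled link
        last_seen = max(last_seen, msg.received_at)
    return last_seen

def run_bot(mailbox, start_time, poll_interval=60, max_passes=None):
    """Steps 2-6 in a loop: process, refresh, wait, repeat."""
    last_seen, passes = start_time, 0
    while max_passes is None or passes < max_passes:
        last_seen = run_once(mailbox, last_seen)
        mailbox.refresh()  # step 6
        passes += 1
        if max_passes is None or passes < max_passes:
            time.sleep(poll_interval)
```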
Resources required from your side:
1) A dummy account, with credentials, that receives the emails from the three websites (the dummy account shouldn't contain any information or content you don't want to share)
Which e-mailing service do you use?
Is your internet connection reliable?
Contact me, if you are interested.
Hi,
I worked as a business development executive in the email marketing domain.
I was very good at data mining because, at that company, I had to build my own database to generate leads.
Awaiting your response; I look forward to hearing from you.
Regards,
Phani (Nick).
I can write this in Python or Node.js. I will write a program that periodically connects to your mailbox via IMAP (most mail services support it), checks for emails matching the template, scrapes the data, and sends the reply. I'm experienced in automating things like this.
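A rough sketch of that IMAP approach using Python's standard imaplib/smtplib. It assumes, for simplicity, that one host serves both IMAP and SMTP over SSL; `make_body` is a placeholder for composing the canned reply with the pre-filled link.

```python
import email
import imaplib
import smtplib
from email.message import EmailMessage

INQUIRY_SENDERS = ("@zillow.com", "@zumper.com", "@apartments.com")

def is_inquiry(msg):
    """Step 1: does the message come from one of the three platforms?"""
    sender = msg.get("From", "").lower()
    return any(dom in sender for dom in INQUIRY_SENDERS)

def build_reply(msg, body):
    """Compose the canned reply, threaded onto the original message."""
    reply = EmailMessage()
    reply["To"] = msg.get("Reply-To", msg.get("From"))
    reply["Subject"] = "Re: " + msg.get("Subject", "")
    if msg.get("Message-ID"):
        reply["In-Reply-To"] = msg["Message-ID"]
    reply.set_content(body)
    return reply

def poll_and_reply(host, user, password, make_body):
    """One polling pass: scan unseen mail over IMAP, answer over SMTP.

    Assumes `host` serves both IMAP and SMTP over SSL; adjust if your
    provider uses separate hostnames.
    """
    imap = imaplib.IMAP4_SSL(host)
    imap.login(user, password)
    imap.select("INBOX")
    _, data = imap.search(None, "UNSEEN")
    for num in data[0].split():
        _, parts = imap.fetch(num, "(RFC822)")
        msg = email.message_from_bytes(parts[0][1])
        if not is_inquiry(msg):
            continue
        with smtplib.SMTP_SSL(host) as smtp:
            smtp.login(user, password)
            smtp.send_message(build_reply(msg, make_body(msg)))
    imap.logout()
```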
Hi, I like solving tasks like this and have experience doing so. I assume you use Gmail, so it won't be hard for a bot to receive emails and then send replies via the Google API. I can develop a bot that does what you need, with a simple web UI for managing the pre-written emails and other settings. (Your Google Doc is private.)