09-19-2013 11:54 AM
I do not know enough about web stuff to give you a good answer.
Lynn
09-19-2013 12:45 PM
I don't have much experience with this either, but the answer depends on the site you're trying to read. If the store locator is a simple page (or list of pages), you can just get the HTML and go through it, as shown earlier in this thread. If it's something more complex (say, one that requires you to sign in first or to perform queries through web services), it can get more involved. I would expect a list of stores to be easy to access as plain HTML, so the best first step is usually to open the relevant page in your browser and look at both the URL and the source HTML to see what they look like.
By the way, if all you want to do is get the list of stores once, something that's very easy to do is simply browse to the page and then save it, which will give you the HTML. You can then parse the HTML and get the data from it.
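To give a rough idea of what "parse the HTML" means in practice, here's a sketch in Python (a LabVIEW diagram can't really be pasted into a post, but the same steps can be done with the HTTP Client and string/match pattern VIs): read the page, or the file you saved, and pull the text out of the tags you care about. The URL and the class name here are made up, so replace them with whatever you actually see in the page source.

```python
# Rough sketch: fetch a page (or read a saved copy) and collect the text
# inside specific HTML tags. The URL and the "address" class name are
# placeholders, not the real page structure.
import urllib.request
from html.parser import HTMLParser

class AddressParser(HTMLParser):
    """Collects the text inside every <div class="address"> element (assumed class name)."""
    def __init__(self):
        super().__init__()
        self.in_address = False
        self.addresses = []

    def handle_starttag(self, tag, attrs):
        if tag == "div" and ("class", "address") in attrs:
            self.in_address = True

    def handle_endtag(self, tag):
        if tag == "div":
            self.in_address = False

    def handle_data(self, data):
        if self.in_address and data.strip():
            self.addresses.append(data.strip())

# Either fetch the page directly...
html = urllib.request.urlopen("http://example.com/store-locator").read().decode("utf-8", "replace")
# ...or read a page you saved from the browser:
# html = open("saved_page.html", encoding="utf-8").read()

parser = AddressParser()
parser.feed(html)
for address in parser.addresses:
    print(address)
```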
If you want, there are multiple examples online, and you can find some of them here:
https://decibel.ni.com/content/groups/labview-example-challenge-march-madness-2011?view=documents
If what you need is a simple HTML scrape and parse, then I know that my example there does that:
https://decibel.ni.com/content/docs/DOC-15433
I would suggest you spend some time inspecting and learning about URLs and HTML and see what you can come up with. If you still don't manage, post back with the URL and what you tried.
09-19-2013 03:54 PM
Thanks guys - this is helpful, but I'm still having some trouble following you. Maybe it would be more helpful if I got more specific... I'm doing a research project and I need the list of Lane Bryant store addresses in the U.S. The store locator page is http://www.lanebryant.com/custserv/locate_store.cmd. If I do a search and pull up stores, the page source gives me the data I need. The problem is, I'd have to do a ton of searches to grab all ~800 stores. Is there any way to extract the entire list that would avoid endless searching, copying, and pasting?
Thanks for the help
09-19-2013 03:59 PM
You may need to take a completely different approach. Running huge numbers of searches is likely to get your IP address blocked before you get through the entire search space, and you'd have to filter out duplicates.
I would see if there's a number you can call or an address you can email at the corporate headquarters, explain that you're doing a research project and ask if they can supply you with a list of all the store locations. Or you can do some searching on the web. I don't know if it's accurate, but 30 seconds of searching provided this list:
http://www.mystore411.com/store/listing/2205/Lane-Bryant-store-locations
09-19-2013 04:00 PM - edited 09-19-2013 04:02 PM
Google can lead you to a website that lists LB stores by state. At least that's not endless copying/pasting, just 50 times. And I bet you could automate that pretty easily in LV, since they seem to be static web pages, not a database.
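If the pages really are static, the automation is just a loop: build each state's URL, read the page, and pull the addresses out. Here's a rough sketch of that loop in Python; the URL pattern and the regex are guesses, not the site's actual structure, so check the real HTML first.

```python
# Sketch only: loop over per-state pages and collect addresses.
# The URL pattern and the regex are assumptions about the site's layout.
import re
import time
import urllib.request

STATES = ["AL", "AK", "AZ", "AR", "CA"]  # ...and the other 45

all_addresses = []
for state in STATES:
    url = f"http://www.example.com/lane-bryant/{state}"  # hypothetical URL pattern
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    # hypothetical markup: addresses wrapped in <span class="addr">...</span>
    all_addresses.extend(re.findall(r'<span class="addr">(.*?)</span>', html, re.S))
    time.sleep(1)  # be polite; don't hammer the site

print(len(all_addresses), "addresses found")
```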
[D*** it, nathand beat me to the site by a minute.]
Cameron
03-16-2014 06:26 PM
Here is a brief description of using web sockets to scrape a website for temperature data: http://labviewtest.blogspot.com/2012/04/website-scraping-with-labview.html
06-10-2014 02:45 PM
Here is a brief tutorial on scraping HTML from websites: http://www.labviewtest.com/scrape-html-website-using-labview/
03-04-2016 02:36 PM
Hello, and thank you for that great example. I have a small inquiry, though. I'm trying to use the same example to access my router's HTTP page (by entering its IP address), but it doesn't seem to be working for some reason. I tried replacing the weather channel URL you provided with the URL of the router, and it doesn't collect any data (string) to compare against the desired pattern. Why is this happening, considering I'm accessing the router as an HTTP request, same as the weather channel? And is there any fix or a way around this? I'm building a robot and putting a web server on it for wireless communication; I want to collect the sensor data on my computer and analyze it in LabVIEW. Thanks in advance, I'm really stuck on this!
09-03-2016 09:23 AM
And if I wanted to read the contents of a web page that requires a login (username and password)?
09-03-2016 11:14 AM - edited 09-03-2016 11:23 PM
@NTampelloni wrote: And if I wanted to read the contents of a web page that requires a login (username and password)?
Try the username and password inputs on the HTTP Client Open Handle VI.
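If it helps to see the idea outside LabVIEW, this is roughly what those inputs do for you: pass credentials along with the request. Note that this only works if the page uses HTTP authentication; a form-based login would instead need you to POST the form fields and keep the session cookie. A rough Python sketch with a placeholder URL and credentials:

```python
# Sketch: read a page protected by HTTP basic authentication.
# Only applies if the site actually uses basic auth; a login form
# would need a POST of the form fields plus cookie handling instead.
import urllib.request

url = "http://example.com/protected/page.html"  # placeholder URL
password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
password_mgr.add_password(None, url, "myusername", "mypassword")  # placeholder credentials
opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(password_mgr))

with opener.open(url) as response:
    print(response.read().decode("utf-8", "replace"))
```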