AllEars menus

EpcotMatt2

DIS Veteran
Joined
Aug 9, 2007
Messages
598
Just curious if there was a way to get all of the AllEars menus in one shot like a Word document. I’ve manually copied & pasted in the past, but this obviously can take a few seconds, minutes, hours, days, etc, haha. So I figured I’d check to see if something like this already exists, especially with the most recent additions & updates. Thanks.
 
I doubt this is possible. Can't think of any way to do it without going to each menu.

BTW, don't forget you can access all the menus right here on the DIS just by going to the dark-blue menu bar near the top of this page. When you hover over "Dining" you'll see that "Menus" is one of the choices.
 
Just curious if there was a way to get all of the AllEars menus in one shot like a Word document. I’ve manually copied & pasted in the past, but this obviously can take a few seconds, minutes, hours, days, etc, haha. So I figured I’d check to see if something like this already exists, especially with the most recent additions & updates. Thanks.

I remember in the days of the "Palm" she had it set up so you could download them onto a Palm, but I'm not sure about a Word doc. If you want the menus on your cell phone, there is an app called WDW Dining that lists most of the menus (not sure if they make the app for Android too). It is updated periodically, but I'm not sure how often. It is really handy because you don't have to fish through a bunch of different screens to get to the dining menus.:thumbsup2
 

OK, this depends on you having a Macintosh computer, or a Linux computer... something with a UNIX-like command line. Open a terminal window and paste the following into it:

# Fetch the menu index page, pull out the links to individual menu
# pages, and rebuild each one as a full URL.
for i in $(curl -s http://allears.net/menu/menus.htm \
    | egrep "men_|menu_" \
    | sed "s/^.*a href=\"/http:\/\/allears.net\/menu\//" \
    | sed "s/.htm.*$/.htm/" \
    | sed "s/menu\/\/menu/menu/"); do
  # Download each menu page, keep only the title, headings, and
  # paragraphs, drop navigation/ad lines, strip leftover img and
  # anchor tags, and append the result to one big HTML file.
  curl -s "$i" \
    | egrep "<p>|h1|h3|title" \
    | egrep -v "Found|error|htm|Banner|Subscribe|Archive|Plan|At-a-Glance|Menus\!|Dinner Shows" \
    | sed "s/<img.*>//" \
    | sed "s/<a.*>//g" \
    | sed "s/title/h1/g" >> trim_menus.html
done

This will download the list of menus, then step through it, downloading each menu and appending it to a single HTML document called trim_menus.html, after stripping out extraneous stuff while attempting to keep all of the menu items. I am sure that it burps on a few of the menus and misses some information, but it should get most of it.
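To see what the URL-rewriting part is doing, here's the same sed chain run on a single made-up line of the menu-list HTML (the real page's markup may differ a bit, so treat this as a sketch):

```shell
# Turn a relative menu link from the index page into a full URL.
# The input line is a hypothetical example, not the real page source.
echo '<li><a href="menu_example.htm">Example Menu</a></li>' \
  | sed "s/^.*a href=\"/http:\/\/allears.net\/menu\//" \
  | sed "s/.htm.*$/.htm/"
# prints: http://allears.net/menu/menu_example.htm
```

The first sed swaps everything up through the href quote for the site prefix, and the second chops off everything after the .htm filename.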

Again, if you have a Mac, you can also do (on the command line in the Terminal):

textutil -convert docx trim_menus.html

and it will spit out a Word document called trim_menus.docx, which you can then open with Word. The original HTML file should open in most web browsers. The Word document, in my tests, ran to 688 pages or so. It also keeps the original copyright information in, because I felt it should not be removed.
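If you're not on a Mac, the free pandoc tool should be able to do the same HTML-to-Word conversion on Windows or Linux (this assumes you have pandoc installed; the filenames are the ones from the steps above):

```shell
# Cross-platform alternative to textutil: convert the scraped HTML
# into a .docx with pandoc (formats are inferred from the extensions).
pandoc trim_menus.html -o trim_menus.docx
```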
 