
BLU Discuss list archive



php: waiting for other slow websites



It was my understanding (and admittedly I'm an AJAX n00b) that
XMLHttpRequests are limited such that they can only make requests back
to the originating server.  (The premise is that this reduces the
possibility of cross-site scripting and other security concerns.)

The right approach, as with most questions, depends on what you're
doing.  If these services return data that your users access
frequently, the simplest approach may be to cache the results from the
remote service(s).

Ideally, you would write your code in such a way that the bulk of it is
written against a local service.  That way, the service can be changed
without affecting the rest of your code.  For example, your image
service may make remote requests to Flickr, or may query a local
filesystem, or both, but the rest of your code only cares that it made
a request to an image service and got back the relevant data.
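To make that concrete, here's a minimal sketch of the idea (all names
are hypothetical).  Callers only ever talk to imageService; the backend
behind it could be swapped for a Flickr client without touching them:

```javascript
// Sketch of a service abstraction; the backend is pluggable.
function makeImageService(backend) {
  return {
    getImages: function (tag) {
      return backend.fetch(tag); // callers never know who actually answered
    },
  };
}

// One possible backend: a local in-memory store, standing in for a
// filesystem scan or a remote Flickr call.
const localBackend = {
  fetch: function (tag) {
    const images = { cats: ["cat1.jpg", "cat2.jpg"], dogs: ["dog1.jpg"] };
    return images[tag] || [];
  },
};

const imageService = makeImageService(localBackend);
console.log(imageService.getImages("cats"));
```

Swapping in a remote backend later means writing one new object with a
fetch method; the rest of the code doesn't change.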

To further the caching example, your service may simply serve stock
quotes, but your implementation stores the data in a local database for
N subsequent requests (or M minutes, etc.) before attempting to fetch
new content.  The worst case here is that the unlucky user who happens
to be page viewer N pays the penalty.  (If an off-cycle process keeps
this data up to date, outside of any request, nobody pays it.)
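The count-based variant of that scheme can be sketched like this (an
in-memory stand-in for the database, with hypothetical names; a real
version would key the cache in persistent storage):

```javascript
// Cache each symbol's quote and serve it for maxHits requests before
// going back to the slow remote source.
function makeCachedQuoteService(fetchQuote, maxHits) {
  const cache = {}; // symbol -> { value, hits }
  return function (symbol) {
    const entry = cache[symbol];
    if (entry && entry.hits < maxHits) {
      entry.hits += 1;
      return entry.value; // cheap: served locally
    }
    // Viewer N (or the very first viewer) pays the remote-fetch penalty.
    const value = fetchQuote(symbol);
    cache[symbol] = { value: value, hits: 1 };
    return value;
  };
}

// Stand-in for the slow remote call; counts how often it's actually hit.
let remoteCalls = 0;
function slowRemoteFetch(symbol) {
  remoteCalls += 1;
  return symbol + "@100.00";
}

const getQuote = makeCachedQuoteService(slowRemoteFetch, 3);
getQuote("IBM"); getQuote("IBM"); getQuote("IBM"); // one remote call
getQuote("IBM"); // cache exhausted after 3 hits: second remote call
console.log(remoteCalls); // → 2
```

A time-based (M minutes) version would store a timestamp in each entry
instead of a hit counter.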

The last advantage of writing a local service that can call out is that
it supports David's AJAX suggestion.  If the service is local, there is
no problem making the request back.  Your page can send an
XMLHttpRequest to your own service (say, getFlickrImages(...)), which
does the grunt work without hindering the initial load time of the
page.  Incidentally, AJAX and REST aren't competing technologies; your
XMLHttpRequest can call back to a REST service which returns JSON or
XML data for your JavaScript to process, or even an (X)HTML representation.
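A rough sketch of the client side of that pattern (the endpoint name
and element id are hypothetical).  The page paints immediately with a
placeholder, and the callback fills the <div> in when the JSON arrives:

```javascript
// Pure helper: turn the service's JSON payload into markup.
function renderImages(payload) {
  return payload.images
    .map(function (url) { return '<img src="' + url + '">'; })
    .join("");
}

// Browser wiring, guarded so the helper above can run anywhere.
if (typeof document !== "undefined") {
  var div = document.getElementById("flickr-box");
  div.innerHTML = "please wait...";            // shown while the page paints
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/getFlickrImages?tag=cats"); // same-origin, so no restriction
  xhr.onload = function () {
    div.innerHTML = renderImages(JSON.parse(xhr.responseText));
  };
  xhr.send();
}
```

Because the request goes to your own server, the same-origin limit I
mentioned at the top never comes into play; your local service does the
cross-site work.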

Hope my rambling was helpful,
-Danny Robert

David Kramer wrote:
> page dynamically, like REST or AJAX.  The basic idea is that you create
> the dynamic parts of your page with some sort of <div> or other block
> tag with an id set.  Then you have a JavaScript function (called by
> onload) get the content from the other site using XMLHttpRequest().
> When it gets the content, it sets those <div> (or whatever) blocks to
> that content.  But the point is that the page paints without first
> waiting for the dynamic content.  It's all asynchronous.  And you can
> set those <div> blocks to some sort of "please wait..." message until
> the data gets there.
>
>
> Google for "AJAX" (more "hot" and "now" and flexible) and REST (easier)
>   


BLU is a member of BostonUserGroups
We also thank MIT for the use of their facilities.




Boston Linux & Unix / webmaster@blu.org