php - How can I efficiently deliver a large number of assets (images)?


I have a single web page responsible for delivering over 500 images to the browser. Sizes vary between 50 KB and 80 KB.

The server being used is nginx, and there is a Varnish proxy in front of it.

Now, how can I make this process as efficient as it can possibly be?

I have two thoughts and would like input from experienced people here.

My thoughts are:

  1. Set up multiple subdomains and serve the images from them in batches. I believe the best number of subdomains to use is 12.

  2. Use AJAX to load batches into the browser as needed, when the user scrolls down.

I think option 2 here doesn't solve the problem; it just gets around it. The focus is on making the process as efficient and as fast as it can possibly be.

You're loading one page with 500 × 50 KB ≈ 25 MB of data; that's an insane page size!

No matter what, that's going to feel slow compared to the current average page size of around 1 MB. Loading the images dynamically via AJAX, as they are needed, makes way more sense. Alternatively, split the content across multiple pages.
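The AJAX approach boils down to paginating the image list server-side and letting the client request one batch at a time as the user scrolls. A minimal PHP sketch of such a batch endpoint is below; the parameter names (`page`, `per_page`) and the JSON response shape are assumptions for illustration, not something from the original post.

```php
<?php
// Sketch of a paginated batch endpoint for AJAX lazy loading.
// The client asks for ?page=0, ?page=1, ... as the user scrolls down,
// and only those images get requested from the server.

function image_batch(array $all_images, int $page, int $per_page = 20): array
{
    // Return just the slice of image URLs the client needs for this step.
    return array_slice($all_images, $page * $per_page, $per_page);
}

// Example usage inside the endpoint (names are illustrative):
// $all_images = glob('/var/www/images/*.jpg');        // or from a database
// $page = max(0, (int)($_GET['page'] ?? 0));
// header('Content-Type: application/json');
// echo json_encode(image_batch($all_images, $page));
```

An empty array signals the client that there are no more batches, which is a convenient stop condition for the scroll handler.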

If you're set on one giant non-dynamic page, then:

  • Make sure you have cache headers set to allow caching (this won't help the first load, though).
  • The main problem (apart from the overall size) is that you have an awful lot of resources being requested. There are three ways to limit the consequences of that:
    • Use sharding (i.e. different subdomains). This works because browsers open only about 4 connections per host, so by using multiple domains you can request/load more resources in parallel.
    • Put the images in a sprite (i.e. one big image, and use CSS to display just the bit you want).
    • Set up the server to use Google's SPDY. That pretty much eliminates the problem of having lots of resources. The downside is that it's still experimental (i.e. you'll need to recompile nginx with patches) and it's not supported by all browsers yet.
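For the sharding idea, the important detail is that each image must always map to the *same* subdomain, otherwise a second page load re-downloads everything from a different host and defeats caching. A small sketch, assuming hypothetical `assetsN.example.com` subdomains and a shard count of 4 (both are illustrative choices, not from the answer):

```php
<?php
// Sketch: map each image deterministically to one of NUM_SHARDS subdomains.
// crc32() is a stable hash, so the same filename always yields the same
// subdomain, keeping the browser cache effective across page loads.

const NUM_SHARDS = 4; // illustrative; the question suggests up to 12

function sharded_url(string $filename): string
{
    $shard = crc32($filename) % NUM_SHARDS;
    return sprintf('http://assets%d.example.com/images/%s', $shard, $filename);
}

// If PHP itself serves the images, long-lived cache headers could be sent
// before the image bytes, along these lines:
// header('Cache-Control: public, max-age=31536000');
```

In practice the subdomains would all point at the same nginx/Varnish backend; only the hostname in the URL changes, to work around the per-host connection limit.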
