Hi Ken,
Maybe you could try this technique / video / bookmarklet from the
Google PageSpeed Insights gang:
http://addyosmani.com/blog/detecting-critical-above-the-fold-css-with-paul-kinlan-video/
The idea here is to discover the above-the-fold, critical-path,
render-blocking CSS styles that keep a browser from, well... painting
the page.
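If all you really need is the subset of rules that hit one section,
something like this rough, untested console sketch might also get you
partway there (the '#header' selector is just a placeholder for
whatever wraps the part of the page you want to reuse):

// Untested sketch: dump the CSS rules that match anything inside one
// section of the page. Paste into the browser console (or wrap it as a
// bookmarklet). Change '#header' at the bottom to your own selector.
(function (containerSelector) {
  var container = document.querySelector(containerSelector);
  if (!container) {
    console.warn('Nothing matches', containerSelector);
    return;
  }

  // Older browsers prefix Element.matches, so fall back if needed.
  function matchesSel(el, sel) {
    var fn = el.matches || el.webkitMatchesSelector ||
             el.mozMatchesSelector || el.msMatchesSelector;
    return fn ? fn.call(el, sel) : false;
  }

  var matched = [];
  for (var i = 0; i < document.styleSheets.length; i++) {
    var rules;
    try {
      rules = document.styleSheets[i].cssRules; // cross-origin sheets throw
    } catch (e) {
      continue;
    }
    if (!rules) continue;
    for (var j = 0; j < rules.length; j++) {
      var rule = rules[j];
      if (!rule.selectorText) continue; // skip @font-face, @media, etc.
      try {
        // Keep the rule if it hits the container or anything inside it.
        if (matchesSel(container, rule.selectorText) ||
            container.querySelector(rule.selectorText)) {
          matched.push(rule.cssText);
        }
      } catch (e) {
        // Ignore selectors the browser can't parse (vendor pseudo-classes, etc.).
      }
    }
  }
  console.log(matched.join('\n'));
})('#header');

It skips inline styles and rules nested inside @media blocks, so treat
it as a starting point rather than a complete extractor.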
Best of luck!
Cheers,
- Eric
On Fri, Sep 13, 2013 at 10:35 AM, Ken Irwin <[log in to unmask]> wrote:
> Hi all,
>
> I'm looking for a tool that I hope exists, and that I hope someone here might be able to point me to. I want to select a portion of a web page (or of the HTML behind it) and be able to copy it ALONG WITH whatever CSS rules apply to that section of code. I don't want the whole 1000+ lines of CSS that pertain to the page, just the 5, 10, or 100 rules that affect the styling of that section.
>
> The situation: my library web page is, by university fiat, wrapped up in our university's overall web design (see: http://www6.wittenberg.edu/lib/ ). It's so complex that it's hard to extract portions of a page for reuse. I want to take the top part of the page and rewrite it in a simplified form (i.e., not thousands of lines of non-relevant CSS) so I can repurpose the same look and feel at the top of our customizable external services (discovery layer, etc.). I could laboriously reconstruct it, but I'm hoping that something exists to help.
>
> The "inspect element" feature built into most browsers is a start. I'm hoping that some tool can leverage the same technology to look at all 50 divs at the same time and spit out a combined pile of CSS rules that will make it all look ok. Does such a tool exist?
>
> Thanks
> Ken