Here's the scenario:
You have an ASP.NET application backed by a Microsoft SQL Server database, and for this example assume there is no caching.
For each page in your web application, which is more efficient:
condensing everything you need into one (or a few) stored procedure calls that return multiple result sets with all of the data you need (which you load into a DataSet),
or
making each call separately (reading each via a DataReader), wherever it logically makes sense?
The reason I ask is that I always end up doing things the second way: writing simple methods that connect to the database and grab some data for each small task I have (e.g. one method to populate a specific dropdown, another for the page's main information, and so on).
My concern is what ends up happening when a page is requested: if I watch SQL Profiler on the server, there can be up to ten or so calls for that single page request. Is that excessive? In my mind it seems more efficient to group data gathering by page rather than by task. Does anyone have experience with this?
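To make the second pattern concrete, here is a minimal sketch of "one small method per UI task". The schema, table names, and method names are hypothetical, and sqlite3 stands in for SQL Server purely so the example is runnable; in the real app each call would be a separate round trip over the network.

```python
import sqlite3

# Hypothetical schema standing in for the real database (sqlite3 is used
# here only so the sketch runs; the app in question uses SQL Server).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE categories (id INTEGER PRIMARY KEY, label TEXT);
    CREATE TABLE pages (id INTEGER PRIMARY KEY, title TEXT, body TEXT);
    INSERT INTO categories VALUES (1, 'News'), (2, 'Sport');
    INSERT INTO pages VALUES (1, 'Home', 'Welcome');
""")

def get_dropdown_options(conn):
    """One small call per UI task -- fills a single dropdown."""
    return conn.execute("SELECT id, label FROM categories").fetchall()

def get_page_main_info(conn, page_id):
    """A separate call for the page's main record."""
    return conn.execute(
        "SELECT title, body FROM pages WHERE id = ?", (page_id,)
    ).fetchone()

# Rendering one page triggers several independent calls,
# which is exactly what shows up as multiple entries in SQL Profiler.
options = get_dropdown_options(conn)
main = get_page_main_info(conn, 1)
```

Each method is easy to reuse on its own, but every extra method a page calls is another trip to the database.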
A large part of my day job is working on a huge WinForms app that is backed by a 600+ table SQL Server database.
We do it the first way to minimise network traffic. It's believed that it's better to have a single bulging envelope than a mail sack full of envelopes.
Also, bundling data for transmission shouldn't be confused with tight coupling: the SQL that gathers the data can stay as modularised as ever, lightly bound together by an umbrella stored proc or view.
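The "umbrella" idea above could be sketched like this. The queries, schema, and function names are made up for illustration, and sqlite3 stands in for SQL Server so the sketch runs; in the real system the umbrella would be a stored proc returning multiple tables into a DataSet.

```python
import sqlite3

# Hypothetical schema (sqlite3 used only to make the sketch runnable).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE categories (id INTEGER PRIMARY KEY, label TEXT);
    CREATE TABLE pages (id INTEGER PRIMARY KEY, title TEXT, body TEXT);
    INSERT INTO categories VALUES (1, 'News'), (2, 'Sport');
    INSERT INTO pages VALUES (1, 'Home', 'Welcome');
""")

# The individual queries stay modular and independently maintainable...
CATEGORY_SQL = "SELECT id, label FROM categories"
PAGE_SQL = "SELECT title, body FROM pages WHERE id = ?"

def fetch_page_bundle(conn, page_id):
    """...while the umbrella call gathers every result set the page
    needs in one trip -- one bulging envelope rather than a sackful."""
    cur = conn.cursor()
    return {
        "categories": cur.execute(CATEGORY_SQL).fetchall(),
        "page": cur.execute(PAGE_SQL, (page_id,)).fetchone(),
    }

bundle = fetch_page_bundle(conn, 1)
```

The caller sees one call; the SQL underneath remains a set of small, separately editable pieces.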
Don't group data gathering by pages. That couples data too tightly to presentation. What if tomorrow the same data has to appear on a different page?