
best way?

I'm trying to update a table in a database based on a query from another database (same database platform, different instances and possibly on different servers).

I'm starting to write this as a service. I am thinking of designing this for easy future expansion.

I may need to add to the list of possible queries/cross-updates in the future.  So I'm wondering what the easiest way to do this would be. If I have a service running, I would simply add to the list of stored procedures for it to execute. 

If, instead, I have that service read a text file with the list of update requests, I would have to update both the text file AND the stored procedures.

I would also like the user to have a list of updates they can turn on and off.  Hmm.  Maybe this is best done without the extra XML/text file to refer to.  But I am trying to foresee circumstances here...how will I pass parameters back and forth?  Maybe a stored procedure could be used with code to generate a set of inserts/updates to be executed.  So my service would use a couple of basic stored procedures to pull the data from one database, then call a command generator that creates the inserts/updates from that dataset, and then execute them against the other database.
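A minimal sketch of that command-generator idea, in Python with sqlite3 standing in for the two instances (the table, column names, and sample data are invented for illustration; a real SQL Server setup would connect via pyodbc and likely use MERGE instead of SQLite's INSERT OR REPLACE):

```python
import sqlite3

def generate_commands(rows, table):
    """Turn a source dataset into parameterized upsert commands."""
    cmds = []
    for row in rows:
        cols = row.keys()
        placeholders = ", ".join("?" for _ in cols)
        sql = (f"INSERT OR REPLACE INTO {table} ({', '.join(cols)}) "
               f"VALUES ({placeholders})")
        cmds.append((sql, tuple(row[c] for c in cols)))
    return cmds

# Two in-memory databases stand in for the two instances.
source = sqlite3.connect(":memory:")
source.row_factory = sqlite3.Row
target = sqlite3.connect(":memory:")

source.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
source.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "acme"), (2, "globex")])
target.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

# Pull the dataset from one side, generate commands, run them on the other.
rows = source.execute("SELECT * FROM customers").fetchall()
for sql, params in generate_commands(rows, "customers"):
    target.execute(sql, params)
target.commit()

print(target.execute("SELECT COUNT(*) FROM customers").fetchone()[0])  # 2
```

The point of the generator layer is that adding a new kind of cross-update only means adding a new source query, not writing a new stored procedure on the target side.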

I think my biggest concern is schema changes in either database and I would rather not have to write a new stored procedure every time someone wants a new kind of insert/update from one db to the other.

Rambling. My head isn't on straight today.
Permalink Send private email sharkfish 
September 2nd, 2006 1:13pm
Specific platform? Most major RDBMS' have capabilities or utilities to do this natively.
Permalink Send private email Edgar Codd 
September 2nd, 2006 1:32pm
+1 to fish-face.  RTFM and all that...

(Old Edgar looks a bit like our very own Clay "that's not a forehead, that's a fivehead!" Dowling.)
Permalink Send private email Mat Hall 
September 2nd, 2006 1:52pm
You have the mind for it, but then the word 'easy', and the question of what the 'pat' way (or even the product way) of doing xyz is, confuses things.
Permalink LinuxOrBust 
September 2nd, 2006 1:53pm
"Most major RDBMS' have capabilities or utilities to do this natively."

With SQL Server?  There's SQL.  But I have to do this every so many milliseconds.  What way are you talking about?  DTS?  Not doing that. Why?  Because sometimes, people only have the desktop engine, which has no tools like that.

I hate asking you guys questions sometimes.

Always some shithead out of the blue.

Fuck off, Codface.  If you are going to give an answer like that, don't answer.
Permalink Send private email sharkfish 
September 2nd, 2006 4:30pm
The generic problem ... being able to modify everything about the db - the schema, data and stored procedures ... is not an easy one. That's why native replication services are recommended.

What's wrong with just keeping the data up to date? After all, when the schema and SPs change is under your control.
Permalink victim of my own anonymity 
September 2nd, 2006 5:49pm
I was waiting for you to guess it.  Part 1 - How about starting off with a service that queries the database schema for changes?
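One way to sketch that schema-watching idea, again with sqlite3 as a self-contained stand-in (on SQL Server you'd query INFORMATION_SCHEMA.COLUMNS instead of sqlite_master/PRAGMA; the table names here are made up):

```python
import sqlite3

def schema_snapshot(conn):
    """Capture a table -> [(column name, type)] mapping for later comparison."""
    snap = {}
    for (name,) in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"):
        cols = conn.execute(f"PRAGMA table_info({name})").fetchall()
        snap[name] = [(c[1], c[2]) for c in cols]
    return snap

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER, total REAL)")
before = schema_snapshot(db)

# Simulate someone changing the schema between polls.
db.execute("ALTER TABLE orders ADD COLUMN shipped TEXT")
after = schema_snapshot(db)

changed = [t for t in after if after[t] != before.get(t)]
print(changed)  # ['orders']
```

The service would take a snapshot on each poll, diff it against the last one, and flag (or regenerate) any cross-updates that touch the changed tables.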
Permalink LinuxOrBust 
September 2nd, 2006 6:38pm
for you to guess it (or come up with it).
Permalink LinuxOrBust 
September 2nd, 2006 6:39pm
I wonder why you bother asking here...
Permalink Send private email Rick Tsang 
September 2nd, 2006 6:49pm
I did something very much like this at a past company. When I started it was a 3 person department to keep all the databases synched. It was down to me, and the parade of folks I was training to replace me, but who'd quit after 2 months.

Originally it was a pile of Access 2 VBA (before my time) that eventually evolved into hundreds of thousands of lines of Access 97 VBA (my time), and with the breaking changes that Access2k had (out with DAO and in with ADO, and about 90% of the code needed to be rewritten - where the fuck did CurrentDB go to?), I was racing to move it to a combination of VB, SQL Server and RAIC (Redundant Array of Inexpensive Computers = that warehouse of pentium 75/90s that hadn't been fully depreciated yet).

I set up a scheduler that basically ran through a table looking to see if it was time to run reports/processes. If so, one of the columns held the name of the DCOM object, along with the last time it ran, what family of parameters it took, and its current status. There was also a watchdog timer so that the scheduler could tell if something broke, and give the job to any other computer able to do it. All of the DCOM objects had identical calling parameters, although I suspect these days you'd just use reflection to determine what they were able to understand.

A future enhancement was to page folks in operations to go over to the stack of steaming poo (RAIC) and reboot the borked box. Back then, Component Source shipped CDs which were easy for a heavy Access user to crack, and one of the toys on the CDs they'd send out like AOL floppies was a pager OCX for sending alphanumeric pages. Nowadays, most carriers have an email interface for pages, and crackberries take whole emails. I think this is where I mention walking to school 15 miles through the snow, uphill both ways, and how we programmed in 1s and 0s, and when we needed 0s, we carved them out of potatoes. ;)

Some of the jobs would take 10 seconds to run on a 333MHz celery box; some would take 8-10 hours to run on a 1GHz Xeon (which was close to top of the line back in 99/00).

As for the processes that moved data around, let's call one "the jones report" and the other one "make the donuts." The scheduler would poll the scoreboard table every couple of minutes to see if it was time to make the donuts (every afternoon at 5pm) or time to run the jones report (only on Thursdays, except during the last week of the quarter, when it ran every hour as folks were busy trying to close out their quarter's numbers).
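The scoreboard-polling loop above can be sketched in a few lines of Python (sqlite3 stands in for the real database, and the job names, column names, and intervals are invented; the original used a richer table with DCOM object names and parameter families):

```python
import sqlite3
import time

def due_jobs(conn, now):
    """Return idle jobs whose interval has elapsed since their last run."""
    return [name for name, last_run, interval in conn.execute(
                "SELECT name, last_run, interval_s FROM scoreboard "
                "WHERE status = 'idle'")
            if now - last_run >= interval]

board = sqlite3.connect(":memory:")
board.execute("CREATE TABLE scoreboard "
              "(name TEXT, last_run REAL, interval_s REAL, status TEXT)")
board.executemany(
    "INSERT INTO scoreboard VALUES (?, ?, ?, ?)",
    [("make_the_donuts", 0,           3600, "idle"),   # never run: overdue
     ("jones_report",    time.time(), 3600, "idle")])  # just ran: not due

print(due_jobs(board, time.time()))  # ['make_the_donuts']
```

A real scheduler would wrap this in a loop, mark a job "running" before dispatching it, and update last_run on completion; keeping the state in a table is what lets anyone watch the scoreboard and lets a watchdog hand a stuck job to another machine.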

Some of the processes were db to db moves, and some of the processes were manual ones involving importing data to/from floppies/cds, crunching it and moving it to other places. One of the more fun ones I came up with on my own took area code splits from nanpa and updated the list of approx 70k pharmacies and 600k prescribers in our databases.

About anyone could watch the scoreboard, and authorized folks could move things ahead. We don't need to make the donuts at 5pm today, we need to make them at 1pm.

>...have that service read a text file...
NT Services running as localsystem don't have access to remote drives. That's a gotcha that you can spend too much time debugging. In addition, they don't have access to the user's desktop except under special circumstances.

http://www.amazon.com/gp/product/0471157546/
http://www.crazyontap.com/topic.php?TopicId=9116

Email me if you want more details, ideas, or if this almost makes sense.
Permalink Send private email Peter 
September 2nd, 2006 7:06pm
Aha!  Peter is actually Albert D. Kallal!
Permalink Send private email Jöhn Härën 
September 2nd, 2006 8:05pm
"I hate asking you guys questions sometimes."

A simple "I'd like this to be platform agnostic" was a perfectly acceptable response to the question about which platform.

Ask a question in a forum full of programmers and they want to understand the problem better so they can give you the best answer.

Yeah, boggles the mind. Totally annoying.
Permalink Send private email Gilbert Gottfried 
September 2nd, 2006 9:14pm
No. But I do like to annoy Kallal. He is more of a MS suckup than philo. Although now that I need a copy of win2k3-sbs, I better suck up to philo.
Permalink Peter 
September 3rd, 2006 12:12am
"I set up a scheduler that basically ran through a table looking to see if it was time to run reports/processes. If so, in one of the columns, the name of the DCOM object was there as well as the last time it ran, what family of parameters it took, and current status."

This is pretty much what I need to do, but I don't think I need a DCOM object.

"NT Services running as localsystem don't have access to remote drives."

But it can log in to remote databases, right?

"Some of the processes were db to db moves"

I don't think I'll be needing disk access, just db to db.

I think I will want a list of updates people want to run, and to let them pick how often/on what schedule to run them via a web page.
Permalink Send private email sharkfish 
September 3rd, 2006 9:33am
I chose DCOM so that the scheduler and the computer(s) that did the job(s) could be on separate machines. Also, if the computer doing the job locks up (it could be that there's an invisible error-message popup), it could be rebooted separately from the rest of the stuff. Some of the jobs took hours, and some of the "customer" complaints were that they couldn't get job_X to run while the monthly reports were being done.

I'm not sure what would be a more modern replacement these days for DCOM.

Log into remote DBs, yes.
Permalink Peter 
September 3rd, 2006 1:07pm

This topic is archived. No further replies will be accepted.
