Need to know how to connect to an Oracle (11g) database from the Log4net.config file (using appenders). I need to write the logs to the Oracle (11g) database whenever any user logs on to the web application.
Note: I am successfully able to write the logs to text files, but not able to write the logs to the Oracle (11g) database.
Any help is greatly appreciated.
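For reference, a minimal AdoNetAppender configuration for Oracle might look like the sketch below, based on the standard log4net examples. The connection string, table name (LOG) and column names are placeholders you would replace with your own; note that Oracle parameters use the `:name` syntax rather than `@name`. (If you are on ODP.NET instead of the deprecated System.Data.OracleClient, the connectionType would be `Oracle.DataAccess.Client.OracleConnection, Oracle.DataAccess`.)

```xml
<log4net>
  <appender name="AdoNetAppender_Oracle" type="log4net.Appender.AdoNetAppender">
    <!-- bufferSize 1 flushes each event immediately; useful while testing -->
    <bufferSize value="1" />
    <connectionType value="System.Data.OracleClient.OracleConnection, System.Data.OracleClient, Version=1.0.3300.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
    <connectionString value="data source=MYORACLEDB;User ID=myuser;Password=mypassword" />
    <commandText value="INSERT INTO LOG (LOG_LEVEL, LOGGER, MESSAGE) VALUES (:log_level, :logger, :message)" />
    <parameter>
      <parameterName value=":log_level" />
      <dbType value="String" />
      <size value="50" />
      <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%level" />
      </layout>
    </parameter>
    <parameter>
      <parameterName value=":logger" />
      <dbType value="String" />
      <size value="255" />
      <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%logger" />
      </layout>
    </parameter>
    <parameter>
      <parameterName value=":message" />
      <dbType value="String" />
      <size value="4000" />
      <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%message" />
      </layout>
    </parameter>
  </appender>
  <root>
    <level value="DEBUG" />
    <appender-ref ref="AdoNetAppender_Oracle" />
  </root>
</log4net>
```

The LOG table (with matching column types) has to exist in the Oracle schema before the appender can write to it.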
As part of my degree I recently submitted an interim paper, and one of the feedback comments from the tutor was along the lines of "consider asking the opinions of others on the database; it is not cheating as long as you properly reference your source". Fair point, I thought, so here I am.
In my submission I had written about the two options I had considered for the database model and why I had elected to choose the one I did. So the question arises from this: what other options are there, or what would your preferred approach be?
Basically, the project is looking at migrating an existing file-based process across to a web solution, and part of the system relates to the daily reporting of end-of-period figures and status.
At the end of each daily 24-hour reporting period, a whole bunch of items are recorded. These are recorded for more than one site, and not every site records the same metrics, but the majority of them are all the same.
As it stands at present, there are approximately 30 parameters recorded, some numerical, some textual.
The first option I considered was row per site per period, with each column representing a metric (I will just call them param, but in reality they have meaningful names).
The second option was splitting the KPIs into groups and having a table per group, with one master record identifying the site and period in another table.
KPI Group A Table
KPI Group B Table
KPI Group C Table
I originally coded the models using a code-first approach (I'm using EF and MVC3) with option 2, as I thought it would be easier to manage and allow for easier scaling, and it appears to work well.
The part I am adding in at the moment is a losses breakdown for the period, so each site might have multiple entries in a loss table linked to the master record in a one-to-many relationship.
So the question is, particularly to any DBAs or architects: how would you build the database to meet these needs, or what suggestions would you have, allowing for potentially more KPI parameters or groups to be added in the future?
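To make option 2 concrete, it might look something like the DDL sketch below. All table and column names here are my own illustrative placeholders, not the poster's actual schema; the Loss table reflects the one-to-many losses breakdown mentioned above.

```sql
-- Master record: one row per site per reporting period.
CREATE TABLE MasterRecord (
    MasterRecordId INT PRIMARY KEY,
    SiteId         INT  NOT NULL,
    PeriodDate     DATE NOT NULL
);

-- One table per KPI group, keyed 1-1 back to the master record.
CREATE TABLE KpiGroupA (
    MasterRecordId INT PRIMARY KEY REFERENCES MasterRecord (MasterRecordId),
    ParamA1        DECIMAL(18, 4),
    ParamA2        VARCHAR(255)
);

-- One row per root-cause loss event: the one-to-many breakdown.
CREATE TABLE Loss (
    LossId         INT PRIMARY KEY,
    MasterRecordId INT NOT NULL REFERENCES MasterRecord (MasterRecordId),
    RootCause      VARCHAR(255),
    LossValue      DECIMAL(18, 4)
);
```

Adding a new KPI group under this design means adding a new table (or new columns to an existing group table), which is the scaling trade-off discussed in the answers below.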
I'd vary on option two: have a master record with the period and all columns that the sites have in common (since they're dependent on the key), and keep the optional columns in a separate table with a 1-1 relation.
It'd mean having to create a new table when a new column definition arises, which may or may not be a problem. It also sucks when creating reports if you have to take into account tables that "might" exist.
The only issue I have is with separating out the common columns; this would break the grouping of columns and fragment a cluster of points across different tables.
Then there is the issue where a point that started as a non-common point becomes a common point due to modifications on the plant. You have then broken the rule of keeping common points together, and moving the column and its data across to the common table could break all the underlying code.
Likewise, going the other way, you may have a point that starts common but, due to modifications on the plant, is no longer common.
I have been working on this on and off over the last week, and so far it still feels more logical to group categories of points together in one table and keep the master site record entry separate.
I have since also added another table which records a breakdown of each individual site's losses (an entry for each root-cause event) for each reporting period (one-to-many), and by introducing this additional table and relationship, it makes even more sense to keep any KPI metrics out of the master table.
So the question is, particularly to any DBA's or architects, how would you or what suggestions would you have on how best to build the database to meet the needs, allowing for potentially more kpi parameters or groups to be added in the future.
Your notation is not familiar to me; however, it doesn't seem to capture what I would do.
The problem, from your description, seems to be the following:
1. You have a number of sites.
2. You collect named 'metrics' from each site.
3. The collection of metrics for a specific site does not match that of other sites.
4. And, at a guess, you have not considered that over time the set of metrics from a single site might change as well.
Given that, I would probably have the following table structure:
1. "Site" with "Site Id" and other information specific only to the site.
2. "Metric Description" with "Metric Desc Id" and perhaps "Value Type"
3. "Metric" which has "Site Id", "Collected Timestamp", "Metric Desc Id" and "Metric Value"
Each site results in an entry in 1.
Each metric results in an entry in 3.
Table 2 is probably managed manually.
The "Value Type" allows one to identify what the "Metric Value" represents. For example it could be a timestamp, count, time span, float, integer, etc.
The above is a bit vague because the specifics of the actual system are needed to refine it further.
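As a rough sketch, the three tables above might be defined as follows. The column names simply follow the ones suggested in the list; storing "Value Type" as a short code and "Metric Value" as a string to be interpreted via that code are assumptions of this sketch.

```sql
-- Table 1: one row per site.
CREATE TABLE Site (
    SiteId   INT PRIMARY KEY,
    SiteName VARCHAR(100) NOT NULL
    -- other information specific only to the site
);

-- Table 2: one row per named metric; probably managed manually.
CREATE TABLE MetricDescription (
    MetricDescId INT PRIMARY KEY,
    MetricName   VARCHAR(100) NOT NULL,
    ValueType    VARCHAR(20)  NOT NULL  -- e.g. 'timestamp', 'count', 'float'
);

-- Table 3: one row per collected metric value.
CREATE TABLE Metric (
    SiteId             INT       NOT NULL REFERENCES Site (SiteId),
    CollectedTimestamp TIMESTAMP NOT NULL,
    MetricDescId       INT       NOT NULL REFERENCES MetricDescription (MetricDescId),
    MetricValue        VARCHAR(255),
    PRIMARY KEY (SiteId, CollectedTimestamp, MetricDescId)
);
```

The payoff of this row-per-metric shape is that adding a new metric to a site is just a new row in MetricDescription, with no schema change at all.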
However, can you give me some tutorials on how to create an SQL database in Visual Studio 2010 using C#, with backup and restore? I know MySQL well from using a server package of it, and I am currently studying the SQL support built into VS2010.