Showing posts with label truncating. Show all posts

Wednesday, March 21, 2012

Logs NOT truncating with Arcserve v9?

Anyone using Brightstor Arcserve v9 + the SQL agent to back up their SQL
databases?
We are using it to back up all our SQL 2000 databases (15) and it works just
fine *except* that even though it is configured to do a 'complete backup' of
each database it doesn't truncate the transaction log at the end... with the
consequence that the logs bloat out to many gigabytes.
I understood that the default behaviour for a 'complete backup' was to
back up the database + logs and then truncate?
The Arcserve agent gives the option to run a 'transaction log' backup (i.e.
logs only), but that is not what we want.
Any ideas on how to get it to truncate those logs once the backup has
completed successfully (other than using another product ;)
NB: the CA support site for Arcserve is *REALLY* crap - every query returns
hundreds of product releases and adverts for their bloody software but no
technical information. Every time they change it, it gets worse.
Al Blake, Canberra, Australia

Al,

> I understood that the default behaviour for a 'complete backup' was to
> backup the database+logs then truncate?
You would have to ask Arcserve about this. In SQL Server, BACKUP DATABASE
does not empty the log. You either do regular log backups or put the
database in simple recovery mode. A tip is to run Profiler while the
Arcserve job is executed to catch what is *really* going on.
Also, information about shrinking files etc. (as I guess you might want to do
this) is found at the site below; see the links in the text:
http://www.karaszi.com/sqlserver/info_dont_shrink.asp
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
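
To make the two options Tibor mentions concrete, here is a minimal T-SQL sketch (the database name and backup path are placeholders, not anything from the original thread):

-- Option 1: stay in full recovery and take regular log backups,
-- which lets SQL Server reuse (truncate) the inactive part of the log.
BACKUP LOG MyDatabase TO DISK = 'D:\Backups\MyDatabase_log.trn'

-- Option 2: switch to simple recovery so the log is truncated
-- automatically at checkpoints (point-in-time restore is lost).
ALTER DATABASE MyDatabase SET RECOVERY SIMPLE

Either option stops the log from growing without bound; which one is appropriate depends on whether point-in-time recovery is required.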
"Al Blake" <al@.blakes.net> wrote in message
news:eBeoGjGEEHA.3804@.TK2MSFTNGP09.phx.gbl...
> Anyone using Brightstor Arcserve v9 + the SQL agent to backup their SQL
> databases?
> We are using it to backup all our SQL2000 databases (15) and it works just
> fine *except* even though it is configured to do a 'complete backup' of
each
> database it doesnt truncate the transaction log at the end...with the
> consequence that the logs bloat out the many Gigabytes.
> I understood that the default behaviour for a 'complete backup' was to
> backup the database+logs then truncate?
> The arcserve agent give the option to run a 'transaction log' backup...(ie
> logs only) but that is not what we want.
> Any ideas on how to get it to truncate those logs once the backup is
> completed sucessfully (other than use another product ;)
> NB the CA support site for Arcserve is *REALLY* crap - every query returns
> hundreds of product releases and adverts for their bloody software but no
> technical information. Every time they change it, it gets worse
> Al Blake, Canberra, Australia
>

Logs not truncating

Hi,
we have a problem on one of our databases: the log is not truncating
after a backup. By now it has grown to 3 GB, with the data file at 156 MB.
Any ideas?
Thanks
Nicholas

Have a look here:
http://support.microsoft.com/default.aspx?scid=kb;en-us;272318
--
HTH
Ryan Waight, MCDBA, MCSE
"Nicholas Aquilina" <naquilina@.gfi.com> wrote in message
news:e$cXQaKnDHA.372@.TK2MSFTNGP11.phx.gbl...
> Hi,
> we have a problem on one of our databases.. mainly the log is not
truncating
> after a backup.. By now it has grown to 3Gb, with the data file at 156Mb.
> Any Ideas?
> Thanks
> Nicholas
>|||Hi
The article helped..I had to use DBCC SHRINKFILE(Database_log,6). I shrunk
the log to 700Mb first from 3Gb, and then the rest with a backup & log
truncate worked normally. Now could the shrinkfile possibly affect any
replication? We have a snapshot replication from this same database to other
servers
Thanks Aagain
Nicholas
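
For reference, a minimal sketch of the sequence Nicholas describes (and that the linked KB article covers for SQL 2000); the database and logical file names are placeholders and the target size is in MB:

-- Truncate the inactive portion of the log without backing it up
-- (SQL 2000 syntax; WITH TRUNCATE_ONLY was removed in later versions).
BACKUP LOG MyDatabase WITH TRUNCATE_ONLY

-- Then shrink the physical log file down to roughly 6 MB.
USE MyDatabase
DBCC SHRINKFILE (MyDatabase_log, 6)

Note that shrinking only reclaims disk space; without regular log backups or simple recovery the log will simply grow again.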
"Ryan Waight" <Ryan_Waight@.nospam.hotmail.com> wrote in message
news:eWDTxdKnDHA.3024@.tk2msftngp13.phx.gbl...
> Have a look here :-
> http://support.microsoft.com/default.aspx?scid=kb;en-us;272318
> --
> HTH
> Ryan Waight, MCDBA, MCSE
> "Nicholas Aquilina" <naquilina@.gfi.com> wrote in message
> news:e$cXQaKnDHA.372@.TK2MSFTNGP11.phx.gbl...
> > Hi,
> >
> > we have a problem on one of our databases.. mainly the log is not
> truncating
> > after a backup.. By now it has grown to 3Gb, with the data file at
156Mb.
> >
> > Any Ideas?
> >
> > Thanks
> >
> > Nicholas
> >
> >
>

Monday, March 12, 2012

Logins Successful or Failed Auditing

Hello,
I have a task to do.
I need to get all the logins (successful or failed) into a table along with
the timestamp, which I can keep truncating as and when I want so that it
doesn't grow too big.
I have tried using server-side traces to do this, but that has two
disadvantages which make me NOT want to use it:
1. The output file (.trc file) can be viewed only when the rollover size is
met or when the SQL Server is stopped <-- bad mojo.
2. There is no solution for when the output files keep on growing and
rolling over. I have 75+ servers to monitor. I want something that will keep
records for, say, the last week, that's all. Output in a table would be so
much better. Apparently, using a server-side trace you can't have the output
in a table.
Does anyone have any suggestions? I figured I could do something like put a
trigger on sysprocesses, but this won't give me the failed logins.
Also, server load would be incredible with a trigger firing off each time.
Any help is appreciated. Oh, and the C2 audit option is out of the question.
That's just too detailed.
Thanks.
Regards,
Kunal

Hello,
The only thing I can suggest is that you can import the data from the .trc
file into a table using the function fn_trace_gettable (see BOL for exact
syntax).
Hope this helps.
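
A minimal sketch of that approach on SQL 2000, assuming a trace file path and target table name (both placeholders), and assuming the trace captured the listed columns:

-- Load an existing trace file (and its rollover files) into a table.
SELECT StartTime, EventClass, LoginName, HostName, ApplicationName
INTO dbo.LoginAudit
FROM ::fn_trace_gettable('D:\Traces\LoginAudit.trc', default)
WHERE EventClass IN (14, 20)  -- 14 = Audit Login, 20 = Audit Login Failed

Old rows can then be deleted from dbo.LoginAudit on whatever schedule is convenient.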
"kunalap" wrote:

> Hello,
> I have a task to do.
> I need to get all the logins (successful or failed) into a table along with
> the timestamp which I can keep truncating as and when I want so that it
> doesn't grow too big.
> I have tried using server side traces to do this. But this has 2
> disadvantages which make me NOT want to use it.
> 1. The output file (.trc file) can be viewed only when the rollover size is
> meet or when the SQL Server is stopped <--bad mojo.
> 2. There is no solution for when the output files keep on growing and
> rolling over. I have 75+ servers to monitor. I want something that will keep
> record for say, last 1 week thats all. Output in a table can be so so better.
> Apparently, using server side trace you cant have output in table.
> Does anyone have any suggestions ? I figured I could do something like put a
> trigger on sysprocesses but this won't give me the failed logins.
> Also, server load would be incredible with a trigger shooting off each time.
> Any help is appreciated. Oh, and C2 Audit option is out of the question.
> Thats just too detailed.
> Thanks.
> Regards,
> Kunal

How about writing a small VB.NET app that is scheduled frequently, reads off the
event log, and imports the entries into a table?
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
Blog: http://solidqualitylearning.com/blogs/tibor/
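
As a T-SQL-only alternative to a scheduled app (not what Tibor suggests; just a sketch under the assumption that login auditing is enabled in the server's security properties, so that login messages reach the SQL Server error log):

-- On SQL 2000, xp_readerrorlog returns two columns (text, continuation flag);
-- the output shape differs on later versions, so treat this as an assumption.
CREATE TABLE #errlog (LogLine varchar(4000), ContinuationRow int)
INSERT #errlog EXEC master..xp_readerrorlog

SELECT LogLine
FROM #errlog
WHERE LogLine LIKE '%Login succeeded%' OR LogLine LIKE '%Login failed%'

On SQL 2000 the timestamp is embedded in the message text, so it would need to be parsed out before loading the rows into a permanent table.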
"kunalap" <kunalap@.discussions.microsoft.com> wrote in message
news:DE02C02D-0C1D-447B-B98C-BE4F63F5A207@.microsoft.com...
> Hello,
> I have a task to do.
> I need to get all the logins (successful or failed) into a table along with
> the timestamp which I can keep truncating as and when I want so that it
> doesn't grow too big.
> I have tried using server side traces to do this. But this has 2
> disadvantages which make me NOT want to use it.
> 1. The output file (.trc file) can be viewed only when the rollover size is
> meet or when the SQL Server is stopped <--bad mojo.
> 2. There is no solution for when the output files keep on growing and
> rolling over. I have 75+ servers to monitor. I want something that will keep
> record for say, last 1 week thats all. Output in a table can be so so better.
> Apparently, using server side trace you cant have output in table.
> Does anyone have any suggestions ? I figured I could do something like put a
> trigger on sysprocesses but this won't give me the failed logins.
> Also, server load would be incredible with a trigger shooting off each time.
> Any help is appreciated. Oh, and C2 Audit option is out of the question.
> Thats just too detailed.
> Thanks.
> Regards,
> Kunal

Thanks for the reply, Anoop.
I was already aware of the function, but it is only for viewing the data in
Query Analyzer.
I guess I will have to set up another job to delete the older .trc files.
Thanks.
-Kunal.
"Anoop" wrote:
[vbcol=seagreen]
> Hello,
> The only thing I can suggest is that you can import the data from the .trc
> file into a table using the function fn_trace_gettable (see BOL for exact
> syntax).
> Hope this helps.
>
> "kunalap" wrote:
