Saturday, December 5th, 2015
1:55 pm - Data warehouse community

Monday, February 11th, 2013
12:09 pm - Creating a "SQL Job Launch Shell" for lower-privileged users

amyboyd
This is in response to my question from 2/4/2013. This was written for SQL Server 2000, but it should work in subsequent versions if you follow my comments.
Design: a user table created with a trigger ( the T-SQL code... )
Procedures:
- check_job_queue - fires off via a scheduled SQL job. It reads from the prod_support_job_queue table.
- make_job_request - the procedure exposed to the production support team. It lets them insert records into the prod_support_job_queue table.
- sp_isJobRunning - modified from publicly available code so that it runs on SQL 2000.
Logic:
- The user makes his request via the make_job_request stored procedure. He is required to enter a valid job name and an action (START, STOP, or CANCEL).
- check_job_queue runs every 10 minutes to check for new actions in the prod_support_job_queue table. It uses system stored procedures in msdb to start and stop jobs. For the CANCEL command, a simple update is issued against the processed field to exclude the request from further processing checks.
- sp_IsJobRunning is called by check_job_queue to see whether the requested job is already running before issuing any commands.
I am still fine-tuning the check_job_queue procedure. Once that is done, I'll post the code for the two custom procedures, check_job_queue and make_job_request.
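Pending that post, here is a rough sketch of the request procedure so you can see the shape of the approach. It assumes prod_support_job_queue has job_name, action, requested_by, requested_at, and processed columns; the real table and the final code may differ.

CREATE PROCEDURE make_job_request
    @job_name sysname,
    @action   varchar(10)   -- START, STOP, or CANCEL
AS
    SET NOCOUNT ON

    -- only accept the three supported actions
    IF @action NOT IN ('START', 'STOP', 'CANCEL')
    BEGIN
        RAISERROR('Action must be START, STOP, or CANCEL.', 16, 1)
        RETURN 1
    END

    -- only accept job names that actually exist on the server
    IF NOT EXISTS (SELECT 1 FROM msdb.dbo.sysjobs WHERE name = @job_name)
    BEGIN
        RAISERROR('Unknown job name.', 16, 1)
        RETURN 1
    END

    -- queue the request for check_job_queue to pick up
    INSERT INTO prod_support_job_queue (job_name, action, requested_by, requested_at, processed)
    VALUES (@job_name, @action, SUSER_SNAME(), GETDATE(), 0)
GO

On its 10-minute schedule, check_job_queue would then read the unprocessed rows, call sp_IsJobRunning, and issue EXEC msdb.dbo.sp_start_job @job_name = ... or EXEC msdb.dbo.sp_stop_job @job_name = ... as appropriate, while CANCEL rows simply get their processed flag updated.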
(comment on this)

Monday, February 4th, 2013
10:14 am - SQL Job Administrators - SQL 2008 R2

Friday, June 8th, 2012
6:37 pm - Question regarding Collations

amyboyd
Does anyone in this group have experience working with Unicode, double-byte, case-sensitive data in SQL 2008 R2?
I would like to select a collation for my database that allows case-sensitive sorting/comparisons with Unicode data that could contain Japanese characters. Whew...that's hard to say.
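Not an answer, just a starting point: sys.fn_helpcollations() lists every collation the instance supports, so a filter like the one below shows the case-sensitive Japanese candidates to test against your data.

SELECT name, description
FROM sys.fn_helpcollations()
WHERE name LIKE 'Japanese%'
  AND name LIKE '%CS%'   -- case-sensitive variants only
ORDER BY name;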
current mood: tired
(3 comments | comment on this)

Tuesday, February 28th, 2012
11:16 am - SQL Server SP2

evedana
Hi, I have multiple instances of SQL Server 2008. We are planning to install SP2 on only one instance. What impact will there be on the rest of the instances, and especially on the shared components? Thank you!!!
(2 comments | comment on this)

Thursday, February 23rd, 2012
5:13 pm - Query fun and games

adina_atl
I've found in general for SQL that there is more than one way to solve (almost) any problem. I've been playing around with query building today and decided to see how many ways I could solve a problem that recurs fairly frequently in my work: flattening subrecords into a single row.
This is my current standard solution, using the PIVOT function. It's quite fast, but limits you to a specific number of subrecords--it can be a high number, but you still have to decide on a maximum.
WITH cte AS (
    SELECT Person.contactid AS 'ID',
           Person.FullName AS 'Name',
           'Activity' = Activity.a422_rel_activityvalueidname,
           'Row' = ROW_NUMBER() OVER (PARTITION BY Person.contactid, Person.FullName
                                      ORDER BY Activity.a422_rel_activityvalueidname)
    FROM Contact AS Person
    INNER JOIN Task AS Activity ON Person.contactid = Activity.regardingobjectid
)
SELECT ID, Name,
       'Activity1' = [1], 'Activity2' = [2], 'Activity3' = [3], 'Activity4' = [4], 'Activity5' = [5]
FROM cte
PIVOT (MAX(cte.Activity) FOR cte.[Row] IN ([1], [2], [3], [4], [5])) AS pvt
( Other ways to get the same thing... )
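For comparison, and not necessarily one of the approaches behind the cut, here is a sketch of the FOR XML PATH concatenation trick against the same tables. It trades the fixed set of ActivityN columns for a single delimited column, so you never have to pick a maximum.

SELECT Person.contactid AS 'ID',
       Person.FullName AS 'Name',
       Activities = STUFF((SELECT ', ' + Activity.a422_rel_activityvalueidname
                           FROM Task AS Activity
                           WHERE Activity.regardingobjectid = Person.contactid
                           ORDER BY Activity.a422_rel_activityvalueidname
                           FOR XML PATH('')), 1, 2, '')
FROM Contact AS Person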
(comment on this)

Monday, October 10th, 2011
4:49 pm - Tracking Database Growth

Wednesday, July 20th, 2011
2:39 pm - DBA Position Open in North Texas

Thursday, June 2nd, 2011
12:29 pm - Production SQL DBA Opening in North Texas

amyboyd
Passing this along for a friend... If you know anyone looking, please let me know. Pay terms seem to be a little higher than normal for that many years of experience.
Responsibilities:
- Installation, configuration, customization, maintenance and performance tuning of SQL Server 2005 & 2008, including SSIS, SSAS and SSRS.
- SQL version migration, patching and security management.
- Monitor database server capacity/performance and make infrastructure and architecture recommendations to management for necessary changes/updates.
- Perform database optimization, administration and maintenance (partitioning tables, partitioning indexes, indexing, normalization, synchronization, job monitoring, etc).
- Manage all aspects of database operations including implementation of database monitoring tools, event monitoring, diagnostic analysis, performance optimization routines and top-tier support for resolving support issues.
- Work with internal IT operations teams to troubleshoot network and server issues and optimize the database environment.
- Establish and enforce database change management standards including pushes from development to QA, on to production, etc;
- Proactively stay current with the latest technologies and industry best practices associated with the position and responsibilities.
- Provide development and production support to troubleshoot day-to-day database or related application issues.
- Develop, implement and verify processes for system monitoring, storage management, backup and recovery.
- Develop, implement and verify database backup and disaster recovery strategies.
- Design and implement all database security to ensure integrity and consistency among the various database regions
- Develop and maintain documentation of the production environment.
- Manage SLAs and strict adherence to production controls - Sarbanes-Oxley (SOX), monitored via external audits.
Necessary Qualifications:
- Must have experience with SQL Server 2005.
- Good exposure to installation and configuration of database clusters, replication, log shipping and mirroring.
- Expertise in troubleshooting and performance monitoring of SQL Server (query tuning, server tuning, disk performance monitoring, memory pressure, CPU bottlenecks, etc.).
- Expertise in T-SQL and writing efficient, highly performing SQL statements.
- Expertise in SQL Server internals, wait events, Profiler, Windows events, etc.
- Must have an understanding of key infrastructure technologies such as clustering, SAN storage, virtualization, cloud services, etc.
Other nice-to-have experience:
- System administration fundamentals, including installation, configuration and security setups.
- Experience with SQL 2008 a plus.
- Experienced in architecting high availability, business resumption and disaster recovery solutions
- Microsoft SQL Server DBA Certification
- Experience with SCOM/SCCM/SCSM is a plus
- Extremely self-motivated, with the ability to work within a globally dispersed team.
Desired Skills:
- Data Warehouse experience
- VLDB experience highly desired
- Experience with databases > 5 TB, processing 2 million + rows of data daily
- MS SQL Server 2005 Transact-SQL (T-SQL)
- Stored procedure development
- Communication skills; works well with the team and within team processes
- Database and file size and space forecasting ability
- Ability to manage a complex database system and assist the client with Database Integration for Future Business Intelligence efforts
- Confio Ignite Performance
Education & Work Experience:
- Bachelor's degree in Computer Science, Business Administration or other related field
- 10+ years of experience as a Database Administrator
current mood: helpful
(comment on this)

Thursday, May 12th, 2011
12:55 pm - Microsoft Tech-Ed

Friday, May 6th, 2011
2:37 pm - Starting with the Specs: Building Solid Code Review Procedure

Monday, April 18th, 2011
11:29 am - Building Solid Code Review Procedures

amyboyd
Does your organization use any code review procedures when promoting code from Development through to Production? If you work for a publicly-held corporation, you know this all too well. If you work for a privately-held company, you can benefit from this procedure as well. Whether you are a developer wanting to make sure all your i's are dotted and all your t's are crossed, or a DBA in operations getting ready to deploy new code, solid code review procedures can really save your butt from major mistakes that can cost you hours of work.
First, let's get a general idea of what a code review life cycle will look like and who is generally involved.
- Unit Test - Validate Business Requirements and Functional Context
- Unit Test - Validate Syntax, Entry and Exit points
- Unit Test & QA - Consistency - Perform Version Control & Transactional Analysis
- QA & Prod - Security - Permissions, Back-out and Error Handling
- QA & Prod - Performance - Validate Memory Usage, Thread Usage and Control Structure usage
- Prod - Documentation - Is there a header, change log, and comments? Are deployment instructions required and included? (One possible header layout is sketched below.)
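For the documentation check, one possible shape for a standard header block (the object name and fields here are placeholders, not a mandated format):

-- =============================================================
-- Object:      dbo.usp_example_procedure   (placeholder name)
-- Author:      <developer>
-- Created:     <date>
-- Description: <what the script does and why>
-- Deployment:  <prerequisites, run order, approvals, back-out plan>
-- Change log:
--   <date>  <who>  <ticket/CR number>  <summary of change>
-- =============================================================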
current mood: hopeful
(comment on this)

Friday, March 18th, 2011
12:41 pm - Datafile Growth in SQL Server - Checking the Statistics

amyboyd
In our last two topics, we discussed gathering the space statistics.
Now we need to devise a procedure called sp_check_usage that will check our statistics and calculate when we expect to run out of space. We will also use two new tables, drive_check_log and server_space_report, to record our data for reporting purposes. On the first day there will be nothing to report, because we need a second day's worth of statistics to compare against.
The new procedure makes a couple of passes over the stored data, uses some temp tables to do the comparison, and then writes a report you can send or review.
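The definitions of the two new tables aren't shown here, but based on the columns the procedure reads and writes, they would look something like this (adjust types, keys and defaults to your own ADMIN database):

create table drive_check_log (
    id        int identity(1,1) not null primary key,
    datestamp datetime          not null default getdate()
)

create table server_space_report (
    drive            char(1)     not null,
    last_sample_date datetime    null,
    avg_per_day      float       null,
    CurrentMBFree    float       null,
    days_left        varchar(50) null
)
GO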
CREATE PROCEDURE [dbo].[sp_check_usage]
as
set nocount on
declare @since_date datetime, @rc1 INT, @files INT, @cmd VARCHAR(256), @datfile1 VARCHAR(128), @subject nvarchar(500)
set @since_date = dateadd(d, -7, getdate())
SELECT @datfile1 = '\RPT' + CONVERT(CHAR(8), GETDATE(), 112) + '.CSV'

-- group by days
select t1.drive,
       datestamp = convert(varchar, datestamp, 111),
       last_sample_id = max(t2.id),
       MBFreeSpace = sum(MBFreeSpace) / count(*),
       sample_count = count(*)
into #x1
from free_space_log t1
inner join drive_check_log t2 on t1.id = t2.id
where t2.datestamp >= @since_date
group by t1.drive, convert(varchar, datestamp, 111)

-- uncomment to debug
-- select * from #x1

-- get a day's growth
select t1.drive,
       avg_per_day = sum(t2.MBFreeSpace - t1.MBFreeSpace) / count(*)
into #x3
from #x1 t1
inner join #x1 t2 on t1.drive = t2.drive
                 and t1.datestamp = convert(varchar, dateadd(d, 1, t2.datestamp), 111)
group by t1.drive

select * from #x3

-- get the latest sample date
select t1.drive,
       last_sample_id = max(t2.id),
       last_sample_date = max(t2.datestamp),
       CurrentMBFree = convert(float, 0)
into #x2
from free_space_log t1 (nolock)
inner join drive_check_log t2 (nolock) on t1.id = t2.id
group by t1.drive

-- set the current free space
update #x2
set CurrentMBFree = MBFreeSpace
from free_space_log t1 (nolock)
where #x2.drive = t1.drive
  and #x2.last_sample_id = t1.id

select * from #x2

-- this is where the procedure produces the output
truncate table server_space_report

insert server_space_report (drive, last_sample_date, avg_per_day, CurrentMBFree, days_left)
select #x2.drive,
       #x2.last_sample_date,
       #x3.avg_per_day,
       #x2.CurrentMBFree,
       days_left = case
                     when #x3.avg_per_day = 0 then 'low or no usage'
                     when #x3.avg_per_day < 0 then 'negative usage'
                     else 'Out of space in '
                          + replicate(' ', 6 - datalength(convert(varchar, round((#x2.CurrentMBFree / #x3.avg_per_day), 0))))
                          + convert(varchar, round((#x2.CurrentMBFree / #x3.avg_per_day), 0))
                          + ' days'
                   end
from #x2
inner join #x3 on #x2.drive = #x3.drive
order by #x2.drive

-- uncomment to debug
-- select * from server_space_report

set nocount off
GO
** Please be responsible with free code. Test and check before implementing in a production environment
current mood: hungry
(1 comment | comment on this)

Friday, March 11th, 2011
11:49 am - Datafile Growth in SQL Server - Getting the Statistics Part II

amyboyd
In our last entry we talked about getting datafile usage in SQL Server. Today, we'll use sp_file_space in another stored procedure that combines it with the extended stored procedure xp_fixeddrives to calculate the free space and store the data in two standard tables.
CREATE PROCEDURE [dbo].[sp_log_spaceused]
as
create table #freespace (
    drive       char(1) null,
    MBfreespace bigint  null
)

set nocount on

delete from #freespace

-- log this server's current space used
insert into file_space_log
exec master.dbo.sp_file_space

-- log the free space
insert into #freespace (drive, MBfreespace)
exec master.dbo.xp_fixeddrives

-- server_drive_space insert
insert into free_space_log
select drive, MBfreespace from #freespace
GO
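free_space_log itself isn't defined in this entry. Inferred from the insert above and from the join on id in the reporting procedure, a plausible definition would be something like the following; how the id column actually gets populated isn't shown in this excerpt.

create table free_space_log (
    id          int     null,   -- assumed to tie back to drive_check_log.id
    drive       char(1) null,
    MBfreespace bigint  null
)
GO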
** Please be responsible with free code. Test and check before implementing in a production environment
current mood: hungry
(comment on this)

Thursday, March 10th, 2011
11:07 am - Datafile Growth in SQL Server - Getting the Statistics Part I

amyboyd
We create a database called ADMIN which stores our administrative information, such as file space statistics. We use a combination of extended stored procedures and publicly-available code to log these statistics. Here is a sample:
/* Get current space statistics. You can run this and store the results in a holding table. */
CREATE PROCEDURE [dbo].[sp_file_space]
    @server_name sysname,
    @id int
as
declare @dbname sysname
declare @cmd varchar(700), @lname_len int, @fname_len int, @fgroup_len int

set nocount on

select @cmd = 'use [?] select ' + convert(varchar,@id) + ',''' + rtrim(@server_name) + ''',db_name(),logical_name = name,fileid,upper(substring(filename,1,1)),filename, filegroup = filegroup_name(groupid), size_in_KB = size * 8, maxsize_in_KB = (case maxsize when -1 then 0 else maxsize * 8 end), growth = (case status & 0x100000 when 0x100000 then growth else growth * 8 end), KB_growth_flag = (case status & 0x100000 when 0x100000 then 0 else 1 end) , usage = (case status & 0x40 when 0x40 then ''Log Only'' else ''Data Only'' end) from sysfiles order by fileid'

exec sp_MSforeachdb @command1 = @cmd

return 0
GO
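A hypothetical call, capturing the output into a holding table the way sp_log_spaceused does in the Part II entry (the table name and @id value here are illustrative):

declare @id int
set @id = 1   -- a run identifier of your choosing

insert into ADMIN.dbo.file_space_log
exec ADMIN.dbo.sp_file_space @server_name = @@SERVERNAME, @id = @id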
** Please be responsible with free code. Test and check before implementing in a production environment
current mood: busy
(comment on this)

Wednesday, March 9th, 2011
10:22 am - How do you track datafile growth?

amyboyd
Here's a good question for data environments today. What methods do you employ to track datafile growth in your SQL Server databases? Do you use a 3rd-party tool, or do you have a home-brew method? I'll share my method once we've read about others' ideas.
current mood: chipper
(comment on this)

Tuesday, February 15th, 2011
11:42 am - How do you promote scripts?

amyboyd
It looks like we haven't had much discussion here in quite a while, so as the community owner, I will try to stir some discussion.
How do you promote your SQL scripts throughout your development, test, and prod environments?
current mood: curious
(12 comments | comment on this)

Monday, November 8th, 2010
2:28 pm - Need a hack for changing default db

amyboyd
I have a user who locked himself out of a server because his default database went into suspect mode. His security policy was nice enough to bar anyone in the Windows Administrators group from logging in to the server. He says he can't remember the two passwords for the administrative logins currently assigned to the System Administrator role on the server. Any hope here? I think he's screwed, personally.
current mood: confused
(1 comment | comment on this)

Tuesday, September 28th, 2010
9:43 am - How to make an Oracle Linked server on SQL 2000

amyboyd
Posting for the benefit of all after personally pulling my hair out on this one...
1) Installed the Oracle Instant Client following these wonderful directions: http://www.dbatoolz.com/t/installing-oracle-instantclient-basic-and-instantclient-sqlplus-on-win32.html
2) Restarted my SQL server to re-load the ODBC drivers
3) Created linked server
USE master
go
EXEC sp_addlinkedserver @server=N'ORCL_SRVR', @srvproduct=N'', @provider=N'MSDASQL', @datasrc=N'ORCL_SRVR'
go
EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='rpc', @optvalue='true'
go
EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='collation compatible', @optvalue='false'
go
EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='data access', @optvalue='true'
go
EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='rpc out', @optvalue='false'
go
EXEC sp_serveroption @server=N'ORCL_SRVR', @optname='use remote collation', @optvalue='true'
go
EXEC sp_addlinkedsrvlogin @rmtsrvname=N'ORCL_SRVR', @useself='FALSE', @rmtuser=N'system', @rmtpassword=N'my password'
go
IF EXISTS (SELECT * FROM master.dbo.sysservers WHERE srvname=N'ORCL_SRVR')
    PRINT N'<<< CREATED LINKED SERVER ORCL_SRVR >>>'
ELSE
    PRINT N'<<< FAILED CREATING LINKED SERVER ORCL_SRVR >>>'
go
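A quick way to sanity-check the new linked server (DUAL is Oracle's built-in one-row dummy table, so this should come back with a single row if the link works):

SELECT *
FROM OPENQUERY(ORCL_SRVR, 'SELECT * FROM dual')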
(2 comments | comment on this)

Tuesday, February 2nd, 2010
1:35 pm

evedana
Hi! I have multiple SQL Server 2005 instances installed on a server. After I removed one of them, MS SQL Server 2005 disappeared from Add/Remove Programs. Has anyone had this problem before? I need to remove another instance. Thank you in advance.
(comment on this)