Dealing with the Bullshitter in Your Life

Taming the Tornado of Talk: Dealing with the Bullshitter in Your Life

We all know them: the masters of hot air, the weavers of tales spun from thin thread, the champions of empty pronouncements. The bullshitter, in all their blustery glory, can be a frustrating force to navigate, whether they’re a colleague, a friend, or even (gulp) a family member. But fear not, truth-seekers! Here’s a guide to help you manage the BS and keep your sanity intact:

Step 1: Recognize the Spin:

First things first, identify the bullshitter in your midst. Listen for telltale signs like grandiose claims made without evidence, a penchant for exaggeration, and an uncanny ability to avoid specifics. Watch for vagueness, deflection, and an unhealthy dose of self-importance. Once you’ve spotted the BS tornado, it’s time to take cover.

Step 2: Ask the Right Questions:

Don’t let the bluster sweep you away. Challenge the bullshitter’s statements with open-ended questions that demand specifics. Ask for evidence, examples, and concrete details. Be polite, but firm. Remember, you’re not accusing them of lying, you’re simply requesting clarity. A true BS-artist will flounder under the spotlight of inquiry.

Step 3: Set Boundaries:

Sometimes, engaging with a bullshitter is a losing battle. Learn to recognize your limits and politely excuse yourself from conversations that go nowhere. Set boundaries and stick to them. If a colleague drones on about their “revolutionary” marketing plan, gently suggest scheduling a dedicated meeting to discuss specifics. With friends and family, you can be more direct, gently redirecting the conversation or even taking a time-out.

Step 4: Humor, Your Secret Weapon:

A well-placed dose of humor can be a surprisingly effective tool against the bullshitter. A lighthearted quip or a playful observation can deflate their inflated ego and bring the conversation back down to earth. Just be sure to keep it good-natured and avoid sarcasm, which can backfire.

Step 5: Lead by Example:

Ultimately, the best way to combat BS is to embody truth yourself. Be clear, concise, and honest in your own communication. When others see your commitment to authenticity, it sets a standard and encourages them to do the same. Remember, silence can be a powerful tool too. Sometimes, simply not engaging with the BS is the most eloquent response.

Dealing with a bullshitter can be a test of patience, but with a little strategy and humor, you can navigate the storm and emerge unscathed. Remember, truth is a beacon in the fog of BS, and it’s your job to keep it shining bright. Now go forth, armed with questions, wit, and a healthy dose of skepticism, and conquer the blustery world of bullsh*t once and for all!

Bonus Tip: Don’t forget to take care of yourself! Dealing with negativity can be draining, so make sure to prioritize your own well-being. Take breaks, engage in activities you enjoy, and surround yourself with positive people. Remember, you can’t control the BS others spew, but you can control how you react to it.

Backup database to a static file and overwrite the backup file every time the backup job runs

I had a request from a business user to create a job that they could run on an ad-hoc basis to back up a database to a static file name. The backup file needed to be deleted/overwritten every time the SQL job ran. I used Ola Hallengren’s backup code to take the database backup with a static file name.

EXECUTE [dbo].[DatabaseBackup] 
@Databases = 'DatabaseName', 
@Directory = N'E:\DeployBackups', 
@BackupType = 'FULL', 
@Verify = 'Y', 
@CheckSum = 'Y',
@Compress = 'Y',
@CleanupMode = 'BEFORE_BACKUP',
@Init = 'Y',
@FileName = '{DatabaseName}.{FileExtension}'

@Databases = the name of the database to back up
@Directory = the directory where the backup file will be written
@BackupType = the type of backup: FULL, DIFF, or LOG
@Verify = verify the backup file after it is completed
@CheckSum = generate a backup checksum so the backup file can be validated
@Compress = compress the backup file
@CleanupMode = when the old backup file should be deleted: before or after the current backup completes. If you have limited space on the drive, deleting the old backup file before the new backup starts is a good option, but you run the risk of having no backup at all if the current backup job fails.
@Init = ‘Y’ or ‘N’; specifies whether the existing backup file is overwritten. It is important to use this option since we are backing up the database to a static file name. If you do not use it, each new backup is appended to the existing file and the backup file keeps growing.
@FileName = ‘{DatabaseName}.{FileExtension}’; this option produces the exact same file name every time the backup runs.
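For reference, here is a rough sketch of what those options translate to in native T-SQL (I’m assuming a .bak extension and the directory from the example above; this is an approximation, not the exact command Ola’s procedure generates):

```sql
-- Overwrite the existing backup set in the static file (INIT),
-- generate page checksums, and compress the backup
BACKUP DATABASE [DatabaseName]
TO DISK = N'E:\DeployBackups\DatabaseName.bak'
WITH INIT, CHECKSUM, COMPRESSION;

-- Rough equivalent of @Verify = 'Y': validate the file without restoring it
RESTORE VERIFYONLY
FROM DISK = N'E:\DeployBackups\DatabaseName.bak'
WITH CHECKSUM;
```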

Hope this helps, good luck.

Powershell script to clean up files from a specific directory

I came across an issue where the SQL backup job I had scheduled was not cleaning up old backup files from the backup directory. I had specified a file cleanup parameter of 1 day, but I only realized something was wrong when the backup drive ran out of disk space, even though I had provisioned double the expected space requirement on the drive.

After investigating, I confirmed that the backup job was not cleaning out the old files from the backup directory. I knew I had to troubleshoot why the SQL job was not deleting them, but I was in the middle of a different production outage and did not have time to work on the cleanup job. And the next scheduled backup was going to start in a couple of hours.

As a quick fix, I ran this PowerShell script to clean out the backup drive so that when the next scheduled backup started, it would be able to run successfully with no issues.

Here is the powershell code I used:

#Parameters
$Path = "B:\Backups" # Directory to clean up
$Days = 1            # Delete files older than this many days
 
#Calculate cutoff date
$CutoffDate = (Get-Date).AddDays(-$Days)
 
#Get all files last modified before the cutoff date and delete them
Get-ChildItem -Path $Path -Recurse -File | Where-Object { $_.LastWriteTime -lt $CutoffDate } | Remove-Item -Force -Verbose

The above code will clean up all files in the specified directory older than 1 day.
You can run it in PowerShell ISE on the server itself, or save it as a .ps1 file and run it from a PowerShell window.
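If you want to preview what the script would delete before actually removing anything, PowerShell’s -WhatIf switch gives you a safe dry run (same path and cutoff as the script above):

```powershell
# Dry run: lists the files that WOULD be deleted, but removes nothing
$Path = "B:\Backups"
$CutoffDate = (Get-Date).AddDays(-1)
Get-ChildItem -Path $Path -Recurse -File |
    Where-Object { $_.LastWriteTime -lt $CutoffDate } |
    Remove-Item -Force -WhatIf
```

Once the output looks right, drop -WhatIf and the same pipeline performs the actual delete.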

Hope this helps. Good Luck!

Find a specific stored procedure in all databases

Got a request to find a specific stored procedure in all the databases.

Instead of scrolling through each database and its stored procedures manually, here is a SQL script that searches all the databases and lists the ones that contain that stored procedure. Hope this helps.

--Search for a specific stored procedure in all the databases on the sql instance
DECLARE @SQL NVARCHAR(max)
    ,@spName VARCHAR(200) = 'name of stored procedure' -- The name of the procedure you are looking for

SELECT @SQL = STUFF((
            SELECT CHAR(10) + ' UNION ALL '           + CHAR(10) +  
' SELECT ' + quotename(NAME, '''') + ' AS DB_NAME '   + CHAR(10) + 
'         , SCHEMA_NAME(s.schema_id)  AS THE_SCHEMA ' + CHAR(10) + 
'         , s.name  COLLATE Latin1_General_CI_AS AS THE_NAME  ' + CHAR(10) + 
'  FROM ' + quotename(NAME) + '.sys.procedures s '    + CHAR(10) +   
' WHERE s.name = @spName 
  AND s.[type] = ''P'''
            FROM sys.databases
            ORDER BY NAME
            FOR XML PATH('')
                ,TYPE
            ).value('.', 'nvarchar(max)'), 1, 11, '')

--PRINT @SQL

EXECUTE sp_executeSQL @SQL
    ,N'@spName varchar(200)'
    ,@spName
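As a quicker (but unsupported) alternative, the undocumented sp_MSforeachdb procedure can do the same search in a couple of lines — replace the placeholder with the procedure name you are looking for:

```sql
-- Print the name of every database that contains the stored procedure
EXEC sp_MSforeachdb 'IF EXISTS (SELECT 1 FROM [?].sys.procedures
                                WHERE name = ''name of stored procedure'')
                     PRINT ''?''';
```

Since sp_MSforeachdb is undocumented, Microsoft can change it at any time, so the dynamic SQL approach above is the safer choice for anything scheduled.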

Get size of all tables in database

Sometimes I receive a request from a client for a list of all the tables and their data sizes.

Here is a helpful sql script that will get the info you need.

Copy the results to an excel file.

SELECT 
    t.name AS TableName,
    s.name AS SchemaName,
    p.rows,
    SUM(a.total_pages) * 8 AS TotalSpaceKB, 
    CAST(ROUND(((SUM(a.total_pages) * 8) / 1024.00), 2) AS NUMERIC(36, 2)) AS TotalSpaceMB,
    SUM(a.used_pages) * 8 AS UsedSpaceKB, 
    CAST(ROUND(((SUM(a.used_pages) * 8) / 1024.00), 2) AS NUMERIC(36, 2)) AS UsedSpaceMB, 
    (SUM(a.total_pages) - SUM(a.used_pages)) * 8 AS UnusedSpaceKB,
    CAST(ROUND(((SUM(a.total_pages) - SUM(a.used_pages)) * 8) / 1024.00, 2) AS NUMERIC(36, 2)) AS UnusedSpaceMB
FROM 
    sys.tables t
INNER JOIN      
    sys.indexes i ON t.object_id = i.object_id
INNER JOIN 
    sys.partitions p ON i.object_id = p.object_id AND i.index_id = p.index_id
INNER JOIN 
    sys.allocation_units a ON p.partition_id = a.container_id
LEFT OUTER JOIN 
    sys.schemas s ON t.schema_id = s.schema_id
WHERE 
    t.name NOT LIKE 'dt%' 
    AND t.is_ms_shipped = 0
    AND i.object_id > 255 
GROUP BY 
    t.name, s.name, p.rows
ORDER BY 
    TotalSpaceMB DESC, t.name

Hope this helps

Disable Foreign Keys

I received a request to export data from a table in a database in Production to a similar table in a database in the Development environment. I used the export/import wizard through SQL Server Management Studio, but my export was failing with an error that the data could not be copied because a Foreign Key was present in the destination database. In the past I would script out a drop-and-create script for all the foreign keys, drop all the Foreign Keys, do the data export, and then re-create the Foreign Keys. After some research online I came across a better option: just disable the Foreign Keys instead of dropping and recreating them.

But first, let’s understand what a Primary Key and a Foreign Key are.

In SQL Server, a primary key is a field (or combination of fields) whose value uniquely identifies a record. Fields that are part of the primary key cannot contain a null value. Usually the primary key is used as an index, but this can vary.

A table can have only ONE primary key, and this primary key can consist of a single column or multiple columns (fields).

Because primary key constraints guarantee unique data, primary key columns are frequently implemented as identity columns, although the two concepts are not the same thing.

When you designate a primary key constraint for a table, the SQL engine enforces data uniqueness by automatically creating a unique index on the primary key columns.

A foreign key is a column or set of columns that allows developers to establish a referential link between the data in two different tables. This link helps to match the foreign key column data with the data of the referenced table data. The referenced table is called the parent table and the table that involves a foreign key is called the child table. In addition, if a foreign key references another column of the same table, this reference type is called a self-reference.

A FOREIGN KEY is a field (or collection of fields) in one table, that links to the PRIMARY KEY in another table.

The table with the foreign key is called the child table, and the table with the primary key is called the referenced table (parent table).

The FOREIGN KEY constraint prevents invalid data from being inserted into the foreign key column in the child table, because it has to be one of the values contained in the parent table.

Based on the developer’s coding standards, it is usually good practice to prefix foreign key names with FK_ (as in FK_{FK name}), and likewise to prefix primary key names with PK_ (as in PK_{PK Name}).

The following SQL query creates a FOREIGN KEY on the “PersonID” column of the Orders table, referencing the Persons table, when the “Orders” table is created:

CREATE TABLE Orders (
    OrderID int NOT NULL,
    OrderNumber int NOT NULL,
    PersonID int,
    PRIMARY KEY (OrderID),
    FOREIGN KEY (PersonID) REFERENCES Persons(PersonID)
);

If the Orders table is already created then use this SQL query to create a FOREIGN KEY constraint on the “PersonID” column:

ALTER TABLE Orders
ADD FOREIGN KEY (PersonID) REFERENCES Persons(PersonID);

If you need to name a Foreign Key constraint and to specify a Foreign Key constraint on multiple columns, use this SQL Query:

ALTER TABLE Orders
ADD CONSTRAINT FK_PersonOrder
FOREIGN KEY (PersonID) REFERENCES Persons(PersonID);

You can disable a Foreign Key on a table using the ALTER TABLE statement. Here is the syntax to disable a foreign key in SQL Server (T-SQL):

ALTER TABLE [your_table_name]
NOCHECK CONSTRAINT [your_fk_name];

Parameters/Syntax:

your_table_name

The name of the table where the foreign key has been created.

your_fk_name

The name of the foreign key that you wish to disable.

The above script uses the ALTER TABLE statement to disable the constraint called your_fk_name on the your_table_name table.

After you have disabled the Foreign Key, you should be able to do your data load through the export/import wizard with no errors.
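Using the FK_PersonOrder constraint from the example above, the full disable / load / re-enable cycle looks like this (the WITH CHECK on re-enable makes SQL Server revalidate the existing rows, so the constraint is trusted again afterwards):

```sql
-- Disable the foreign key before the data load
ALTER TABLE Orders NOCHECK CONSTRAINT FK_PersonOrder;

-- ... run the data load here ...

-- Re-enable the foreign key and revalidate the existing rows
ALTER TABLE Orders WITH CHECK CHECK CONSTRAINT FK_PersonOrder;

-- Confirm nothing was left disabled or untrusted
SELECT name, is_disabled, is_not_trusted
FROM sys.foreign_keys
WHERE name = 'FK_PersonOrder';
```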

To disable all constraints

-- disable all constraints
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT all'

To turn the constraints back on, run this (the print command is optional; it just lists the database tables):

-- enable all constraints
EXEC sp_MSforeachtable @command1='print ''?''', @command2='ALTER TABLE ? WITH CHECK CHECK CONSTRAINT all'

Disabling constraints is very helpful when you have to copy data from one database to another. I prefer it to dropping and recreating the constraints.

If you have triggers in the database, you will have to disable them prior to your data load and then re-enable them once the data load is completed.

To disable all constraints and triggers, run this:

EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT all'
EXEC sp_MSforeachtable 'ALTER TABLE ? DISABLE TRIGGER all'

To enable all constraints and triggers, run this:

EXEC sp_MSforeachtable @command1='print ''?''', @command2='ALTER TABLE ? WITH CHECK CHECK CONSTRAINT all'
EXEC sp_MSforeachtable @command1='print ''?''', @command2='ALTER TABLE ? ENABLE TRIGGER all'

A word of caution on disabling constraints and triggers: you have to make sure no new deltas are being written to the database by users, because once you disable all the constraints and triggers, any new deltas written to the database might violate the integrity of the data. Hence you have to ensure that all application traffic is stopped.

Also, if you need to import a large amount of data, consider using BULK INSERT because this method does not fire the triggers. However, after your bulk insert is completed, you will need to fix any data integrity issues that occurred because the trigger logic was bypassed.
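Here is a minimal BULK INSERT sketch — the table name, file path, and terminators are hypothetical placeholders, so adjust them to match your data file:

```sql
-- Load a CSV file straight into the table; triggers do not fire
-- unless FIRE_TRIGGERS is explicitly added to the WITH clause
BULK INSERT dbo.Orders
FROM 'C:\Data\orders.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2  -- skip the header row
);
```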

Hope this helps clarify the concept of Primary Keys, Foreign Keys and Constraints.

Unable to open SQL Server Configuration Manager. When I click on Configuration manager I get the error: Cannot connect to WMI provider

I RDP’d to the SQL server to check which SQL services were installed on it. I tried to open SQL Server Configuration Manager but could not. When I clicked on Configuration Manager I got the error: Cannot connect to WMI provider.
You do not have permission or the server is unreachable. Note that you can only manage SQL Server 2005 and later servers with SQL Server Configuration Manager.

The first thing I did was to check if I have admin rights to the server.

How to fix the WMI Provider Error:
Open a command prompt on the local machine (run as administrator).
Then run the command below that matches the SQL Server version installed on the machine.

SQL Server 2005
mofcomp "%programfiles(x86)%\Microsoft SQL Server\90\Shared\sqlmgmproviderxpsp2up.mof"

SQL Server 2008 / R2
mofcomp "%programfiles(x86)%\Microsoft SQL Server\100\Shared\sqlmgmproviderxpsp2up.mof"

SQL Server 2012
mofcomp "%programfiles(x86)%\Microsoft SQL Server\110\Shared\sqlmgmproviderxpsp2up.mof"

SQL Server 2014
mofcomp "%programfiles(x86)%\Microsoft SQL Server\120\Shared\sqlmgmproviderxpsp2up.mof"

SQL Server 2016
mofcomp "%programfiles(x86)%\Microsoft SQL Server\130\Shared\sqlmgmproviderxpsp2up.mof"

SQL Server 2017
mofcomp "%programfiles(x86)%\Microsoft SQL Server\140\Shared\sqlmgmproviderxpsp2up.mof"

SQL Server 2019
mofcomp "%programfiles(x86)%\Microsoft SQL Server\150\Shared\sqlmgmproviderxpsp2up.mof"

SQL Server 2022
mofcomp "%programfiles(x86)%\Microsoft SQL Server\160\Shared\sqlmgmproviderxpsp2up.mof"
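If you are not sure which version folder exists on the server, a quick way to locate the MOF file (assuming a default installation path) is to search for it from the command prompt:

```bat
:: Recursively search the SQL Server program folder for the MOF file
dir "%programfiles(x86)%\Microsoft SQL Server\sqlmgmproviderxpsp2up.mof" /s /b
```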

If you run the above command for the correct SQL version installed on your server, mofcomp will report that the MOF file was successfully parsed and the data stored in the repository.

Hope this helps people resolve the issue they get when they try to open SQL Server configuration manager and get a WMI error.

Grant Users permission to edit SQL Job Schedules

If a non-sysadmin user requests permission to modify SQL job schedules for jobs they do not own, you can do the following:

Grant the user execute permission on these msdb stored procedures:

sp_update_job
sp_update_jobschedule
sp_update_jobstep

Here is the T-SQL for it (note that these procedures live in msdb, so run the grants in that database):

USE msdb;
GRANT EXECUTE ON dbo.sp_update_job TO [username];
GRANT EXECUTE ON dbo.sp_update_jobschedule TO [username];
GRANT EXECUTE ON dbo.sp_update_jobstep TO [username];

I also tried granting the user db_owner on the msdb database, but the user was still not able to edit the SQL Server job schedule.

The user kept getting an error along the lines of: Only sysadmin role members can edit and run jobs owned by others.

On researching the error, it initially seemed there was no option other than granting the user sysadmin permission so they could edit the job schedules for all the jobs on the SQL instance. Then I came across this info:

Grant execute permission on these stored procedures (again, run the grants in msdb):

USE msdb;

GRANT EXECUTE ON dbo.sp_add_job TO [username];
GRANT EXECUTE ON dbo.sp_add_jobstep TO [username];
GRANT EXECUTE ON dbo.sp_add_jobschedule TO [username];

GRANT EXECUTE ON dbo.sp_update_job TO [username];
GRANT EXECUTE ON dbo.sp_update_jobstep TO [username];
GRANT EXECUTE ON dbo.sp_update_jobschedule TO [username];

GRANT EXECUTE ON dbo.sp_delete_job TO [username];
GRANT EXECUTE ON dbo.sp_delete_jobstep TO [username];
GRANT EXECUTE ON dbo.sp_delete_jobschedule TO [username];

GRANT EXECUTE ON dbo.sp_help_job TO [username];
GRANT EXECUTE ON dbo.sp_help_jobstep TO [username];
GRANT EXECUTE ON dbo.sp_help_jobhistory TO [username];

GRANT EXECUTE ON dbo.sp_start_job TO [username];
GRANT EXECUTE ON dbo.sp_stop_job TO [username];

And it worked!! So there was no need to grant the user sysadmin rights on the SQL instance.
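A related option worth knowing about: msdb ships with the fixed database roles SQLAgentUserRole, SQLAgentReaderRole, and SQLAgentOperatorRole, which bundle SQL Agent permissions at increasing levels. Note, though, that even SQLAgentOperatorRole members cannot modify jobs owned by other users, which is why the explicit grants above were needed in this case:

```sql
-- Grant the broadest non-sysadmin SQL Agent role
USE msdb;
ALTER ROLE SQLAgentOperatorRole ADD MEMBER [username];
```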

A Dazzling Oasis in the Desert

A Dazzling Oasis in the Desert: Unveiling the Palm Jumeirah

Dubai, the land of shimmering skyscrapers and audacious ambition, boasts many marvels. But none quite capture the city’s spirit like the Palm Jumeirah, a breathtaking archipelago of artificial islands shaped like a palm tree. This man-made wonder, visible even from space, is a testament to human ingenuity and a playground for the rich and famous.

A Story of Sand and Innovation:

Born from a vision to expand Dubai’s coastline and attract tourism, the Palm Jumeirah’s construction began in 2001. Billions of tons of sand were dredged from the seabed and meticulously sculpted into the iconic palm shape. The “trunk” connects to the mainland, while the 17 “fronds” form exclusive residential neighborhoods and luxurious resorts. The surrounding crescent, aptly named The Crescent, is home to iconic hotels like the Atlantis The Palm, with its underwater suites and thrilling water park.

A Paradise for Luxury Seekers:

Stepping onto the Palm Jumeirah is like entering a world of extravagance. Five-star hotels like One & Only The Palm and Jumeirah Zabeel Saray line the fronds, offering opulent accommodations, private beaches, and infinity pools with breathtaking views. Michelin-starred restaurants, designer boutiques, and opulent spas cater to every whim. For a touch of adventure, skydiving over the island or embarking on a dolphin-watching cruise are just a few of the possibilities.

Beyond the Glitz:

But the Palm Jumeirah isn’t just about glitz and glamour. Public beaches like Umm Suqeim Park offer a more relaxed atmosphere, while The Pointe, a waterfront promenade, bustles with trendy restaurants and lively bars. Families flock to Aquaventure Waterpark, a sprawling aquatic wonderland with thrilling slides and lazy rivers. And for a glimpse into local life, head to the traditional souk on the trunk, where you can bargain for spices, perfumes, and handicrafts.

A Controversial Gem:

The Palm Jumeirah’s environmental impact has been a subject of debate. The construction disrupted marine life and raised concerns about rising sea levels. However, the island’s developers have implemented measures to mitigate these effects, including the creation of artificial reefs and marine sanctuaries.

A Must-See for Any Traveler:

Despite the debates, the Palm Jumeirah remains a remarkable engineering feat and a popular tourist destination. Whether you’re seeking ultimate luxury, family fun, or a glimpse into Dubai’s audacious spirit, this dazzling oasis in the desert promises an unforgettable experience.

So, pack your bags, dust off your sunglasses, and get ready to be awestruck by the Palm Jumeirah. This man-made marvel is waiting to be discovered.

I hope you enjoyed this glimpse into the dazzling world of the Palm Jumeirah.

Teens and Screens

Teens and Screens: Finding Balance in a Digital World

Teenagers and screens – it’s a relationship that can be both rewarding and challenging. From staying connected with friends to exploring endless entertainment options, screens offer a world of possibilities. But with great power comes great responsibility, and for teens, finding a healthy balance between screen time and real-world experiences is crucial.

So, how do we, as adults, advise teenagers on navigating this digital landscape? Here are some tips:

1. Open Communication is Key:

Instead of dictating screen time limits, initiate open and honest conversations. Talk about the pros and cons of screen time, and listen to their thoughts and concerns. This approach fosters trust and empowers teens to make responsible choices.

2. Highlight the Importance of Real-World Experiences:

Encourage teens to participate in activities that get them away from screens. Sports, hobbies, spending time with friends and family – these real-world experiences contribute to healthy social, emotional, and physical development.

3. Lead by Example:

Teens are keen observers, and our own screen habits have a significant impact. Be mindful of your screen time, especially in their presence. Put your phone away during meals and family time, and prioritize real-world interactions.

4. Set Ground Rules Together:

Collaboratively establish screen time guidelines that work for everyone. This could involve setting limits on total screen time, designating screen-free zones like bedrooms, and establishing technology-free times before bed.

5. Make Screen Time Meaningful:

Not all screen time is created equal. Encourage teens to use screens for educational purposes, creative activities, and connecting with loved ones. Help them discover educational apps, documentaries, and online communities that enrich their lives.

6. Promote Digital Wellness:

Talk about the importance of online safety, responsible social media use, and cyberbullying awareness. Guide them on setting privacy settings, protecting their information, and being mindful of their online footprint.

7. Be Patient and Supportive:

Changing habits takes time and effort. Be patient with teens as they adjust to new screen time guidelines. Offer support and encouragement, and celebrate their successes along the way.

Remember, finding a healthy balance with screen time is a journey, not a destination. By fostering open communication, setting clear expectations, and providing positive reinforcement, we can help teenagers navigate the digital world responsibly and develop healthy habits for life.

Bonus Tip: Utilize screen time management apps and tools available on most devices. These tools can help track usage, set time limits, and even block certain websites or apps during designated times.

By working together, we can ensure that screens become a tool for good, enriching the lives of teenagers without compromising their well-being and development.

How to Handle a Lying Co-worker

Working with a Pinocchio in the Office: How to Handle a Lying Co-worker

We’ve all encountered them: the office fibber, the master of tall tales, the colleague whose truth meter perpetually flickers between “mostly false” and “outright fabrication.” Working with a lying co-worker can be a major headache, eroding trust, creating tension, and potentially hindering your own work. But before you storm into your boss’s office demanding a lie detector test for everyone, take a deep breath and consider these strategies:

1. Observe and Assess:

Don’t jump to conclusions based on a single white lie about weekend plans. Instead, observe your co-worker’s behavior over time. Look for patterns in their fibs. Are they harmless exaggerations, or do they directly impact your work or reputation? Are they compulsive, or situational (like hiding a late-night pizza run from the boss)? Gathering intel will help you determine the best course of action.

2. The Direct Approach (but Tread Carefully):

If the lies are directly affecting you or your work, consider a private, calm conversation. Express your concerns without personal attacks or accusations. Stick to facts and specific examples: “I noticed you told the client the project would be finished by Friday, but my timeline shows next week. Can we discuss a more realistic deadline?”

3. The Indirect Approach: Let the Truth Set You Free:

Sometimes, the best way to combat a lie is to simply present the truth. If your co-worker claims to have aced a presentation they clearly bombed, don’t engage in their fantasy. Offer constructive feedback based on reality, or simply redirect the conversation to actionable takeaways. The truth, presented calmly and professionally, can be a powerful tool.

4. Document and Report (When Necessary):

If the lies are serious, malicious, or jeopardizing the company or client relationships, documenting and reporting them to your supervisor or HR might be necessary. Keep a record of dates, times, and specific instances of the lies, along with any witnesses or evidence. Remember, this should be a last resort, as it can escalate the situation and damage workplace relationships.

5. Protect Yourself:

While addressing the behavior is important, don’t let the lies consume you. Don’t share confidential information with your untrustworthy co-worker, and be cautious about collaborating on projects. Focus on your own work ethic and maintain a positive, professional attitude. Remember, you can’t control others’ actions, but you can control your own.

Bonus Tip: Remember, context matters. A small, inconsequential lie about catching the last bus is different from a fabrication that impacts deadlines or puts someone’s job at risk. Use your judgment and prioritize addressing the lies that truly matter.

Living with a workplace Pinocchio can be a challenge, but by staying calm, collected, and strategic, you can navigate the situation effectively. Remember, communication, clear boundaries, and a focus on your own work ethic are key to maintaining your sanity and productivity in the face of office fibs.

I hope this article helps you deal with your lying co-worker! Remember, open and honest communication is always the best policy, even if it’s a little awkward at first. With a little effort, you can create a more truthful and trusting work environment for everyone.

Top 10 things to do in Dubai

Dubai is a city of superlatives. Home to the world’s tallest building, the Burj Khalifa, the largest indoor ski slope, Ski Dubai, and the most luxurious hotels, Dubai is a must-visit for any traveler looking for a truly unforgettable experience.

But beyond the glitz and glamour, there’s also a rich culture and history to be discovered in Dubai. From the traditional souks (markets) of Deira to the stunning Sheikh Zayed Grand Mosque in neighboring Abu Dhabi (a popular day trip), there’s something for everyone in and around this fascinating city.

Here are the top 10 things to do in Dubai:

Visit the Burj Khalifa
No trip to Dubai would be complete without a visit to the Burj Khalifa. At 828 meters tall, it’s the tallest building in the world, and the views from the observation deck on the 124th floor are simply breathtaking.

Go shopping at the Dubai Mall
The Dubai Mall is not just a shopping mall, it’s a world unto itself. With over 1,300 stores, an aquarium, an ice rink, and a cinema, you could easily spend a whole day here.

Explore the souks of Deira
The souks of Deira are a great place to experience traditional Dubai. Here you can find everything from gold and spices to textiles and souvenirs. Be sure to haggle for the best price!

Take a dhow cruise on Dubai Creek
A dhow cruise is a great way to see the sights of Dubai from a different perspective. You’ll float past the traditional wooden dhows, the souks, and the Sheikh Zayed Road.

Visit the Sheikh Zayed Grand Mosque
The Sheikh Zayed Grand Mosque, located in neighboring Abu Dhabi about an hour’s drive from Dubai, is one of the most beautiful mosques in the world. It’s made of white marble and can accommodate up to 40,000 worshipers.

Ski at Ski Dubai
Ski Dubai is an indoor ski resort located in the Mall of the Emirates. It’s a great place to cool off if you’re visiting Dubai during the summer months.

Relax on the beach
Dubai has some of the most beautiful beaches in the world. Jumeirah Beach is a popular spot for swimming, sunbathing, and water sports.

Go on a desert safari
A desert safari is a must-do activity for any visitor to Dubai. You’ll get to go dune bashing, camel riding, and enjoy a traditional barbecue dinner under the stars.

See the Dubai Fountain
The Dubai Fountain is the world’s largest choreographed fountain system. It’s located in downtown Dubai and puts on a spectacular show every evening.

Visit the Dubai Aquarium & Underwater Zoo
The Dubai Aquarium & Underwater Zoo is home to tens of thousands of marine animals. You can walk through a tunnel and see sharks, stingrays, and other creatures swimming overhead.

These are just a few of the many things to do in Dubai. With so much to see and do, you’re sure to have an unforgettable time in this amazing city.

I hope this article has been helpful on your trip planning.

Does my Azure SQL Managed Instance have a read-only replica?

Here are two ways to check whether your Azure SQL Managed Instance has a read-only replica:

1. Check the service tier:

The read-only replica (the Read Scale-Out feature) is built into the Business Critical service tier, while the General Purpose tier does not include one. So the quickest check is to look at the instance’s service tier on the Overview page in the Azure portal.

2. Connect with ApplicationIntent=ReadOnly:

Add ApplicationIntent=ReadOnly to your connection string, then run the following query in that session:

SELECT DATABASEPROPERTYEX(DB_NAME(), 'Updateability');

If the session was routed to a read-only replica, the query returns READ_ONLY. If it returns READ_WRITE, the session landed on the primary, which means no read-only replica was available for routing.

Additional notes:

  • Read Scale-Out routing applies to Azure SQL Managed Instance (and Azure SQL Database); it does not apply to on-premises SQL Servers.
  • Data on the read-only replica is kept in sync with the primary, but there can be a small replication latency, so a query on the replica may not see the very latest committed changes.

I hope this helps!

Azure SQL Managed Instance Read-Only Replica


Feeling the heat of heavy read loads on your Azure SQL Managed Instance? Introducing the read-only replica, your knight in shining armor for scaling read performance and offloading pressure from your primary database. Buckle up, as we explore the magic of this powerful tool!

What is a Read-Only Replica?

The read-only replica is a near-exact copy of your primary database, maintained in real-time through continuous synchronization. Its sole purpose? Serving up blazing-fast read queries without impacting write operations on the primary. Think of it as a dedicated reading room for your busy database, allowing your primary instance to focus on write tasks without getting bogged down by read requests.

Benefits of Read-Only Replicas:

  • Enhanced Scalability: Handle more read traffic with ease. Point reporting and analytics connections at the replica so the primary's capacity stays reserved for transactional work.
  • Improved Performance: Enjoy significantly reduced response times for read-heavy applications like reporting, analytics, and dashboards. Queries fly without impacting write performance on the primary instance.
  • Increased High Availability: If your primary goes down, a designated replica can seamlessly take over, minimizing downtime and ensuring data availability.
  • Cost Efficiency: In the Business Critical tier the readable replica is included in the price of the instance, so you offload read traffic without paying for a separate copy of your data.

Who Needs Read-Only Replicas?

  • Applications with high read-to-write ratios: Reporting, analytics, data warehouses, dashboards, and web applications with heavy read traffic.
  • Businesses experiencing scaling challenges: Struggling to meet read demand with your current configuration? Replicas help you scale efficiently and keep up.
  • Organizations demanding high availability: Eliminate single points of failure and ensure continuous access to data even during primary instance outages.

Getting Started with Read-Only Replicas:

Setting up your read-only replica is effortless. Azure SQL Managed Instance includes a built-in readable replica in the Business Critical tier, reachable by simply adding ApplicationIntent=ReadOnly to your connection string. And if you need a readable copy in another region, an auto-failover group gives you a geo-secondary with its own read-only listener endpoint.

Remember: Replicas are for reading only! Write attempts over a ReadOnly connection fail with an error; they are not redirected to the primary. The good news is that you never have to sync the replica yourself — it is kept up to date automatically and continuously from the primary.
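To see the routing in action, here's a minimal check, assuming a Business Critical instance; the server and database names in the connection string are placeholders:

```sql
-- Hypothetical connection string (values are placeholders):
--   Server=myinstance.abc123.database.windows.net;Database=MyDb;ApplicationIntent=ReadOnly;
-- Once connected, verify where the session actually landed:
SELECT DATABASEPROPERTYEX(DB_NAME(), 'Updateability') AS [Updateability];
-- READ_ONLY  -> you are on the replica
-- READ_WRITE -> you were routed to the primary
```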

Unleash the Power of Read-Only Replicas:

By embracing read-only replicas, you’ll experience a revolution in your Azure SQL Managed Instance performance. Faster reads, smoother scaling, and enhanced high availability – all while optimizing costs. So, ditch the bottleneck and empower your database with the agility and resilience of read-only replicas. Your applications and users will thank you!

Ready to dive deeper? Microsoft's documentation on read scale-out for Azure SQL covers the configuration details and limitations.

Don’t let heavy read loads hold you back. Unleash the full potential of your Azure SQL Managed Instance with the power of read-only replicas!

Synchronous vs. Asynchronous: Demystifying Always On Availability Group Modes


Always On Availability Groups in SQL Server offer high availability and disaster recovery solutions. But under the hood, two distinct data synchronization methods dictate how changes flow between the primary and secondary replicas: synchronous and asynchronous. Choosing the right one hinges on your specific needs and performance priorities.

Synchronous Replicas: Dancing to the Same Beat

Imagine a synchronized dance troupe. Synchronous replicas mirror the primary replica's actions step by step. Every transaction committed on the primary waits until the synchronous secondaries have hardened the log records to disk before the commit is acknowledged. This guarantees that committed data exists on every synchronized replica. However, the wait introduces latency, potentially impacting user experience, especially with geographically dispersed replicas.

Benefits:

  • Guaranteed data consistency: No risk of data divergence between replicas.
  • Safe failover: No committed data loss on failover, since transactions are hardened on the secondary before they are acknowledged.
  • Read-scale availability: Secondary replicas can serve read-only workloads, reducing pressure on the primary.

Drawbacks:

  • Higher latency: Waiting for synchronous commits can impact performance.
  • Limited scalability: Large numbers of synchronous replicas can burden the primary.
  • Commit sensitivity: If a synchronous secondary stops responding, commits on the primary can stall until the secondary is marked NOT SYNCHRONIZED, after which the primary runs without its zero-data-loss guarantee until synchronization resumes.

Asynchronous Replicas: Flying Solo with Lag

Asynchronous replicas, like solo dancers, apply changes independently. Transactions committed on the primary are queued and sent asynchronously to the secondary. This eliminates waiting, improving performance and scalability. However, a data lag can exist between replicas, creating potential inconsistencies.

Benefits:

  • Lower latency: Transactions commit faster without synchronous waits.
  • Higher scalability: Can handle more secondary replicas without impacting performance.
  • Resilience to slow links: A lagging or temporarily disconnected secondary never blocks commits on the primary; the log stream simply catches up once the connection is restored.

Drawbacks:

  • Potential data inconsistencies: Data lag can occur, leading to temporary data differences.
  • Slower failover: More data might need to be replayed on failover, increasing downtime.
  • Limited read-scale availability: Asynchronous replicas might not be fully synchronized for read workloads.

Choosing the Right Rhythm:

The choice between synchronous and asynchronous boils down to your tolerance for latency vs. data consistency:

  • Synchronous: Choose for mission-critical applications requiring constant data consistency and fast failover, even at the cost of performance.
  • Asynchronous: Choose for performance-sensitive applications where read-scale availability and scalability are crucial, despite a potential data lag.

A Final Note:

Always On Availability Groups offer flexibility with hybrid configurations. You can mix synchronous and asynchronous replicas within the same group based on specific needs. Remember to carefully evaluate your workload characteristics and desired recovery times before settling on a rhythm for your availability group.
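Switching a replica between the two modes is a one-statement change; a sketch, assuming an availability group named AG1 with a replica on server SQLNODE2 (both names are placeholders):

```sql
-- Run on the primary: demote the replica to asynchronous commit
ALTER AVAILABILITY GROUP [AG1]
MODIFY REPLICA ON N'SQLNODE2'
WITH (AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT);

-- ...and back to synchronous commit with automatic failover
ALTER AVAILABILITY GROUP [AG1]
MODIFY REPLICA ON N'SQLNODE2'
WITH (AVAILABILITY_MODE = SYNCHRONOUS_COMMIT, FAILOVER_MODE = AUTOMATIC);
```

Note that FAILOVER_MODE = AUTOMATIC is only valid on a synchronous-commit replica, which is why the demotion drops it implicitly.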

Stay tuned for further articles delving deeper into specific aspects of Always On Availability Groups!

Check when update stats was last run, by object name

I was told to find out the last time update stats was run on an object.

I used this query to show StatisticUpdateDate, which is the last date/time update stats was run for each statistic on the object:

--UpdateStats last update
SELECT
    OBJECT_NAME([object_id]) AS [ObjectName],
    [name] AS [StatisticName],
    STATS_DATE([object_id], [stats_id]) AS [StatisticUpdateDate]
FROM
    sys.stats
--WHERE OBJECT_NAME([object_id]) = 'YourTableName' -- uncomment to filter to a single object
ORDER BY
    [StatisticUpdateDate] DESC;
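If you also want to see how many rows have changed since the last update, you can cross apply sys.dm_db_stats_properties (available from SQL Server 2008 R2 SP2 onward); the table name in the filter is a placeholder:

```sql
SELECT
    OBJECT_NAME(s.[object_id]) AS [ObjectName],
    s.[name] AS [StatisticName],
    sp.last_updated,
    sp.modification_counter -- rows modified since the last statistics update
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.[object_id], s.stats_id) AS sp
WHERE OBJECT_NAME(s.[object_id]) = 'YourTableName'; -- placeholder: filter by object name
```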

Hope this helps.

SQL Server Replication Explained


SQL Server Replication: Keeping Your Data in Sync

Imagine having multiple databases scattered across your organization, each containing crucial information but living in blissful isolation. What if changes made in one database need to be reflected in the others, ensuring everyone operates with the same up-to-date data? Enter SQL Server replication, a powerful tool that bridges the gap between isolated databases, keeping them synchronized and consistent.

Think of replication as a synchronized dance between databases. One database, the publisher, takes the lead, making changes to its data. The distributor, acting as the choreographer, receives these changes and distributes them to the subscribers, the eager followers who update their own data accordingly. This constant communication ensures everyone is on the same page, avoiding data inconsistencies and outdated information.

Why Use SQL Server Replication?

The benefits of keeping your databases in sync are numerous:

  • Enhanced Data Availability: Replication ensures geographically dispersed users access the latest data, regardless of their location.
  • Improved Disaster Recovery: In case of server failure, subscribers continue functioning with replicated data, minimizing downtime.
  • Scalability and Performance: Offload read-intensive workloads to subscribers, reducing pressure on the publisher and improving overall performance.
  • Data Sharing and Collaboration: Facilitate collaboration between departments by keeping relevant data synchronized across different databases.

Types of Replication in SQL Server

SQL Server offers different types of replication to cater to diverse needs:

  • Transactional Replication: Continuously copies changes from the publisher to subscribers, ensuring near real-time data synchronization.
  • Merge Replication: Allows for bi-directional data exchange, enabling updates made on subscribers to be propagated back to the publisher.
  • Snapshot Replication: Periodically copies the publisher’s data to subscribers as a complete snapshot, replacing the subscriber’s copy each time rather than sending incremental changes.

Setting Up Replication:

Implementing replication involves configuring the publisher, distributor, and subscribers, defining which data to replicate, and scheduling synchronization intervals. While it may seem complex initially, SQL Server provides intuitive tools and wizards to guide you through the process.
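Under the covers, the wizards for transactional replication drive a handful of system stored procedures. Here is a heavily trimmed sketch of the publisher side; the database, publication, table, and server names are placeholders, and the distributor is assumed to be configured already:

```sql
-- Enable the database for publishing
EXEC sp_replicationdboption
    @dbname = N'SalesDb', @optname = N'publish', @value = N'true';

-- Create a transactional publication
EXEC sp_addpublication
    @publication = N'SalesPub', @status = N'active';

-- Add a table (an "article") to the publication
EXEC sp_addarticle
    @publication = N'SalesPub',
    @article = N'Orders',
    @source_object = N'Orders';

-- Add a push subscription pointing at the subscriber server
EXEC sp_addsubscription
    @publication = N'SalesPub',
    @subscriber = N'SUBSCRIBERSRV',
    @destination_db = N'SalesDbCopy',
    @subscription_type = N'Push';
```

In practice the wizard also creates the snapshot and log reader agents for you, which this sketch omits.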

Beyond the Basics:

SQL Server replication goes beyond simple data synchronization. Advanced features like:

  • Filtering and transformation: Tailor what data gets replicated and how it’s transformed during the process.
  • Conflict resolution: Define how conflicting changes from different subscribers are handled.
  • Security: Implement robust security measures to protect replicated data from unauthorized access.

Unlocking the Power of Synchronization:

SQL Server replication is a valuable tool for organizations with geographically distributed databases or those requiring enhanced data availability and disaster recovery capabilities. By understanding its different types, functionalities, and configuration options, you can leverage its power to keep your data synchronized, consistent, and accessible, ultimately empowering informed decision-making across your organization.

So, the next time you find yourself juggling multiple databases, remember SQL Server replication – the conductor of your data symphony, ensuring everyone plays in perfect harmony.

I hope this article has provided a comprehensive overview of SQL Server replication.

What is SQL Server Query Store

SQL Server Query Store: A Deep Dive into Performance Insights

Imagine peering into the inner workings of your SQL Server, witnessing the intricate dance of queries and their performance in real-time. That’s the power of the SQL Server Query Store, a built-in tool that sheds light on your database’s health and efficiency.

Unveiling the Mystery of Query Execution

Gone are the days of blind troubleshooting, chasing slow queries like phantoms in the SQL Server labyrinth. The Query Store acts as a performance detective, capturing a wealth of data about every query executed:

  • Query text and plan: See exactly what queries are running and how they’re planned for execution.
  • Runtime statistics: Track execution time, CPU usage, I/O operations, and other metrics to pinpoint bottlenecks.
  • Wait statistics: Identify locks, blocking sessions, and other resource contentions impacting performance.
  • Historical trends: Analyze query performance over time and spot anomalies or regressions.

With this treasure trove of information at your fingertips, you can:

  • Identify and optimize slow queries: Quickly pinpoint the culprits behind sluggish performance and tune them for efficiency.
  • Prevent performance regressions: Track changes in query plans and performance metrics to proactively address potential issues.
  • Understand workload patterns: Gain insights into how your database is used at different times, enabling better resource allocation.
  • Troubleshoot query-related issues: Diagnose specific problems like locking or blocking with detailed execution data.
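All of this data is exposed through catalog views, so you can query it directly. For example, the top queries by average duration across the captured intervals can be pulled like this:

```sql
-- Top 10 queries by average duration, from the Query Store views
SELECT TOP (10)
    qt.query_sql_text,
    rs.avg_duration,        -- microseconds
    rs.count_executions
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q
    ON q.query_text_id = qt.query_text_id
JOIN sys.query_store_plan AS p
    ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs
    ON rs.plan_id = p.plan_id
ORDER BY rs.avg_duration DESC;
```

Keep in mind that runtime stats are recorded per collection interval, so a busy query will appear once per interval; aggregate over the intervals if you want a single row per plan.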

Beyond Basic Monitoring: A Toolbox for Optimization

The Query Store isn’t just a passive observer; it’s an active participant in your performance optimization journey. Here’s how:

  • Query plan forcing: Fix inefficient query plans by forcing a specific plan to be used, ensuring consistent performance.
  • Query history cleanup: Manage storage space by defining policies for automatically archiving or deleting historical data.
  • Alerts and monitoring: Query Store has no built-in alerting, but its views make great raw material for your own — for example, a SQL Agent job that scans them for slow executions or plan regressions and raises a notification.
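Plan forcing, for instance, is just a stored procedure call. The query and plan IDs below are placeholders you would first look up in the Query Store views or SSMS reports:

```sql
-- Pin plan 5 for query 42 (IDs are placeholders)
EXEC sp_query_store_force_plan @query_id = 42, @plan_id = 5;

-- Undo the forcing later, e.g. if the data distribution changes
EXEC sp_query_store_unforce_plan @query_id = 42, @plan_id = 5;
```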

Unlocking the Potential of Your SQL Server

The SQL Server Query Store is a powerful tool, but its effectiveness hinges on proper utilization. Here are some tips to make the most of it:

  • Enable Query Store on all databases: Don’t miss out on valuable insights; activate it across your entire SQL Server environment.
  • Define capture policy: Tailor data collection based on your needs, balancing performance impact with data granularity.
  • Leverage built-in analysis tools: SQL Server Management Studio offers intuitive dashboards and reports to visualize and analyze query store data.
  • Integrate with third-party tools: Extend your analysis capabilities with specialized tools for deeper performance insights and optimization recommendations.

Empowering Informed Decisions with Data-Driven Insights

The SQL Server Query Store empowers you to move beyond guesswork and intuition in managing your database. By harnessing its data-driven insights, you can optimize performance, prevent issues, and ensure your SQL Server runs like a well-oiled machine. So, dive into the Query Store, unveil the secrets of your queries, and unlock the full potential of your SQL Server environment.

Remember, the Query Store is just one piece of the puzzle. Combining it with other performance monitoring tools and best practices can create a comprehensive strategy for maintaining a healthy and efficient SQL Server infrastructure.

We can enable Query Store through the SSMS GUI or with a SQL script.
Here is the SQL script to turn on Query Store for a database:

ALTER DATABASE <DatabaseName> SET QUERY_STORE = ON;

SQL Server applies default values for the Query Store parameters. In SQL Server 2019, Microsoft changed the defaults for some of them. Here is a table of the attributes with their former and current default values:

Attribute                  SQL Server 2017    SQL Server 2019
max_storage_size_mb        100                500
interval_length_minutes    60                 15
query_capture_mode         ALL                AUTO
flush_interval_seconds     900                3,000
max_plans_per_query        200                1,000
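If you would rather not depend on version-specific defaults, the options can be set explicitly when enabling Query Store; a sketch (the database name is a placeholder):

```sql
ALTER DATABASE [MyDatabase]
SET QUERY_STORE = ON
    (OPERATION_MODE = READ_WRITE,
     MAX_STORAGE_SIZE_MB = 500,
     INTERVAL_LENGTH_MINUTES = 15,
     QUERY_CAPTURE_MODE = AUTO);
```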

I hope this article has given you a deeper understanding of the SQL Server Query Store and its potential to revolutionize your database performance management.

Traditional Artificial Intelligence

Traditional AI Shapes Our World, One Algorithm at a Time

While headlines trumpet the rise of “superintelligent” AI, a quieter revolution is already underway. Traditional AI, also known as “narrow” or “weak” AI, may not grab the same attention as its flashier cousin, but it’s deeply woven into the fabric of our daily lives, influencing everything from the coffee you order in the morning to the routes your GPS navigates.

Unlike the grand ambitions of general AI, which aims for human-level sentience, traditional AI excels at performing specific tasks with superhuman efficiency and precision. Think of it as a master of narrow domains, a virtuoso chess player or a lightning-fast medical scanner, unmatched in its chosen field.

Here’s where you’ll find traditional AI making a difference:

  • Behind the scenes of your digital world: Search engines like Google utilize AI algorithms to sift through mountains of data, delivering the most relevant results in milliseconds. Recommendation engines on platforms like Netflix or Amazon predict your preferences with uncanny accuracy, suggesting movies or products you’ll love.
  • Optimizing our everyday routines: Traffic lights adapt to congestion patterns, thanks to AI that predicts traffic flow and minimizes bottlenecks. Fraud detection systems employ AI to sniff out suspicious activity, safeguarding your financial transactions. Even your smartphone camera uses AI to automatically adjust focus and lighting, capturing flawless photographs.
  • Fueling scientific and medical breakthroughs: Drug discovery is accelerated with AI tools that analyze vast datasets of molecules, predicting promising candidates for further research. AI-powered medical imaging aids in early disease detection and personalized treatment plans.

Traditional AI might not chat like a friend or write poetry like Shakespeare, but its impact on our lives is undeniable. It makes our world more efficient, convenient, and safe, often working silently behind the scenes, without fanfare.

However, it’s not without its challenges. Concerns loom around bias in algorithms, leading to discriminatory outcomes. We must constantly strive to develop and deploy AI responsibly, ensuring fairness and transparency in its applications.

As we move forward, traditional AI will continue to evolve, becoming even more adept at what it does best. But its value lies not in replacing human intelligence, but in complementing it, amplifying our capabilities, and paving the way for a future where humans and machines work together to create a better world.

Remember, the next time you marvel at a self-driving car or at the precision of a medical diagnosis, you’re witnessing the quiet power of traditional AI.


What is Generative AI

Artificial intelligence has long captivated our imaginations. From chess-playing computers to self-driving cars, AI has showcased its prowess in mimicking and surpassing human capabilities. But a new frontier is emerging: generative AI, where machines leap from analysis to artistry, not just replicating, but creating entirely new content.

This isn’t your grandfather’s AI, churning out repetitive formulas. Generative AI delves into the heart of human creativity, weaving words into captivating stories, sculpting pixels into breathtaking landscapes, and even composing symphonies that tug at our emotions. It learns the intricate dance of patterns and probabilities within vast datasets, then uses this knowledge to extrapolate, invent, and surprise.

Imagine a world where:

  • Writers’ block becomes a thing of the past, with AI partners conjuring fresh plot twists and vibrant characters.
  • Musicians collaborate with digital orchestras, their melodies merging with AI-generated harmonies to create unheard-of soundscapes.
  • Fashion designers unveil garments spun from algorithms, each piece a unique expression of AI’s artistic vision.
  • Architects conjure futuristic cities, shaped not by blueprints but by generative AI’s understanding of space, light, and human needs.

The applications stretch far beyond the realm of artistic expression. Scientists can leverage generative AI to synthesize new materials with revolutionary properties, engineers can prototype complex structures in virtual reality, and doctors can personalize medical treatments based on AI-generated simulations.

But with such tremendous power comes responsibility. Generative AI raises ethical concerns around deepfakes, misinformation, and potential biases. We must ensure these tools are used for good, fostering responsible development and deployment, and prioritizing human values in the AI creation process.

The rise of generative AI marks a pivotal moment in our relationship with technology. It’s not about replacing human creativity; it’s about expanding its boundaries, opening doors to possibilities we can only begin to imagine. As we step into this brave new world, we must do so with open minds, critical thinking, and a commitment to harnessing the power of generative AI for a brighter future.

This is just the beginning of the story. What will you create with generative AI?

Create SQL Script to update stats for all tables in a database with Full Scan

I was asked to run update stats on all the tables in a database with full scan.
I was to use native SQL code, not Ola Hallengren’s maintenance scripts or other open-source tools.
An additional requirement was that the update stats command for each table be listed individually, as a separate line of code.

The database had many tables, so listing each one by hand was going to be very burdensome.

Thankfully, I found this SQL script, which generates an UPDATE STATISTICS statement for each table in the database. You just have to copy the query output into a new query window, or put it in a SQL job, and run it.

SET NOCOUNT ON
GO
 
DECLARE updatestats CURSOR FOR
SELECT table_schema, table_name
FROM information_schema.tables
WHERE table_type = 'BASE TABLE'
OPEN updatestats
 
DECLARE @tableSchema NVARCHAR(128)
DECLARE @tableName NVARCHAR(128)
DECLARE @Statement NVARCHAR(300)
 
FETCH NEXT FROM updatestats INTO @tableSchema, @tableName
 
WHILE (@@FETCH_STATUS = 0)
BEGIN
  -- QUOTENAME safely brackets schema and table names, even awkward ones
  SET @Statement = 'UPDATE STATISTICS ' + QUOTENAME(@tableSchema) + '.' + QUOTENAME(@tableName) + ' WITH FULLSCAN'
  PRINT @Statement -- comment this line out if you want to run the statements instead of just generating the code
  --EXEC sp_executesql @Statement -- uncomment to run each statement directly
  FETCH NEXT FROM updatestats INTO @tableSchema, @tableName
END
 
CLOSE updatestats
DEALLOCATE updatestats
GO
SET NOCOUNT OFF

Hope this helps, enjoy!