
As technology continues to evolve, the role of the Database Administrator (DBA) has evolved with it. A DBA is responsible for managing and maintaining databases, ensuring data integrity, and keeping databases available. With the increasing amount of data that organizations manage, DBAs are in high demand. However, many of the tasks they perform can be automated. In this post, we will discuss five tasks that DBAs spend most of their time on but that can be automated.

  1. Backup and Recovery: One of the primary responsibilities of a DBA is to ensure that data is backed up and can be recovered in the event of a disaster. However, with modern database management systems, backup and recovery processes can be automated (see the RMAN sketch after this list). This can save DBAs a significant amount of time, allowing them to focus on other tasks.
  2. Monitoring: DBAs spend a lot of time monitoring databases to ensure that they are running smoothly. This includes monitoring performance, identifying potential issues, and making adjustments as needed. However, with the use of advanced monitoring tools, many of these tasks can be automated. Automated monitoring tools can detect issues and make adjustments in real-time, freeing up DBAs to focus on other critical tasks.
  3. Performance Tuning: Performance tuning is another critical task that DBAs spend a lot of time on. They need to analyze queries, identify bottlenecks, and make adjustments to improve performance. However, modern database management systems come with built-in performance tuning capabilities that can be automated. Automated performance tuning tools can analyze queries, identify bottlenecks, and make recommendations for improvements.
  4. Patching and Upgrades: DBAs spend a lot of time patching and upgrading databases to ensure that they are secure and up-to-date. However, patching and upgrades can be automated. Automated patching and upgrade tools can identify vulnerabilities and install updates automatically, saving DBAs a significant amount of time.
  5. Capacity Planning: DBAs spend a lot of time planning and managing database capacity to ensure that it meets the needs of the organization. However, with modern database management systems, capacity planning can be automated. Automated capacity planning tools can analyze database usage patterns and make recommendations for capacity planning, allowing DBAs to focus on other tasks.
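As a concrete illustration of the first point, here is a minimal nightly RMAN sketch. The seven-day retention window and the choice to delete archive logs once they are backed up are illustrative assumptions, not a universal policy; schedule it from cron, OEM, or DBMS_SCHEDULER as suits your environment.

# minimal nightly RMAN script against the target database (values are illustrative)
configure retention policy to recovery window of 7 days;
# full database backup plus archive logs, removing archive logs once they are backed up
backup database plus archivelog delete input;
# purge backup pieces that the retention policy no longer needs
delete noprompt obsolete;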

While DBAs play a critical role in managing and maintaining databases, many of their tasks can be automated. Automated tools can save DBAs a significant amount of time, allowing them to focus on other critical tasks such as data modeling, data architecture, and database design. Organizations should explore the use of automation tools to improve efficiency and free up DBAs to focus on strategic initiatives that will drive business growth.

The idea is not to replace DBA jobs, but to make them more effective and valuable!

Cloud computing is one of the fastest-growing areas of IT, and obtaining a cloud certification can help you stand out in a competitive job market. In this article, we will discuss the top 10 cloud certifications for a high-performing IT career in 2023.

  1. AWS Certified Solutions Architect – Associate. This certification is designed for individuals who want to demonstrate their expertise in designing and deploying scalable, highly available, and fault-tolerant systems on AWS. It is a highly sought-after certification and can help you stand out in a competitive job market.
  2. Google Cloud Certified – Professional Cloud Architect. This certification demonstrates your expertise in designing, developing, and managing solutions on Google Cloud Platform. It is a highly respected certification and can help you advance your career in cloud computing.
  3. Microsoft Certified: Azure Solutions Architect Expert. This certification validates your expertise in designing and implementing solutions on Microsoft Azure. It is a highly sought-after certification and can help you advance your career in cloud computing.
  4. CompTIA Cloud+. This certification is designed for IT professionals who want to demonstrate their expertise in cloud computing. It covers cloud computing concepts, virtualization, and infrastructure management, and can help you stand out in a competitive job market.
  5. Salesforce Certified Technical Architect. This certification is designed for individuals who want to demonstrate their expertise in designing and developing solutions on the Salesforce platform. It is a highly respected certification and can help you advance your career in cloud computing.
  6. IBM Certified Solution Advisor: Cloud Computing Architecture V3. This certification is designed for IT professionals who want to demonstrate their expertise in cloud computing architecture. It covers cloud computing concepts, design principles, and best practices, and can help you advance your career in cloud computing.
  7. Red Hat Certified System Administrator in Red Hat OpenStack. This certification is designed for IT professionals who want to demonstrate their expertise in Red Hat OpenStack cloud computing. It covers installation, configuration, and management of OpenStack, and can help you stand out in a competitive job market.
  8. VMware Certified Professional – Cloud Management and Automation. This certification validates your expertise in deploying and managing VMware cloud infrastructure. It covers cloud automation, cloud operations, and cloud management, and can help you advance your career in cloud computing.
  9. Docker Certified Associate. This certification validates your expertise in designing, deploying, and managing Docker containers. It is a highly respected certification and can help you stand out in a competitive job market.
  10. Oracle Cloud Infrastructure Certified Architect Associate. This certification is designed for IT professionals who want to demonstrate their expertise in designing and implementing solutions on Oracle Cloud Infrastructure. It covers cloud computing concepts, design principles, and best practices, and can help you advance your career in cloud computing.

In conclusion, cloud certifications are an excellent way to demonstrate your expertise in cloud computing and advance your career. The certifications listed above are among the most respected and sought-after in the industry and can help you stand out in a competitive job market.

As we enter 2023, the database technology landscape continues to evolve rapidly. With new trends and advancements, it is essential to stay up-to-date on the latest database technologies to remain competitive in today’s data-driven world. In this article, we will discuss the top database technology trends for 2023.

  1. Cloud Databases: Cloud databases have been gaining popularity for some time now, and their use is expected to continue to increase in 2023. Cloud databases are hosted on a cloud computing platform, allowing users to access and manage their data from anywhere in the world. The benefits of cloud databases include cost savings, scalability, and ease of use.
  2. NoSQL Databases: NoSQL databases are designed to handle unstructured data, making them ideal for applications that require high scalability and flexibility. They are gaining popularity due to their ability to handle large amounts of data efficiently, as well as their ability to scale horizontally.
  3. Graph Databases: Graph databases are designed to handle complex relationships between data points, making them ideal for applications such as social networks and recommendation systems. With the rise of machine learning and AI, graph databases are expected to become even more popular in 2023.
  4. Blockchain Databases: Blockchain databases are decentralized, distributed databases that are designed to be immutable and secure. They are ideal for applications that require high security and transparency, such as supply chain management and financial transactions. In 2023, we expect to see an increase in the adoption of blockchain databases, as more companies seek to improve their data security.
  5. In-Memory Databases: In-memory databases are designed to store data in RAM, allowing for fast and efficient access to data. They are ideal for applications that require high performance and low latency, such as real-time analytics and high-frequency trading. In 2023, we expect to see an increase in the adoption of in-memory databases, as more companies seek to improve their real-time data processing capabilities.
  6. Hybrid Databases: Hybrid databases combine the benefits of both traditional and modern database technologies. They are designed to handle both structured and unstructured data efficiently, allowing users to extract insights from a variety of data sources. In 2023, we expect to see an increase in the adoption of hybrid databases, as more companies seek to combine their existing data infrastructure with modern database technologies.

In conclusion, database technology trends are constantly evolving, and staying up-to-date on the latest advancements is essential for remaining competitive in today’s data-driven world. In 2023, we expect to see an increase in the adoption of cloud databases, NoSQL databases, graph databases, blockchain databases, in-memory databases, and hybrid databases. Companies that embrace these technologies will be better positioned to extract insights from their data, improve their operational efficiency, and gain a competitive advantage in their respective industries.

How often do we see the wrong use of to_char on date fields causing a huge performance impact?

I just saw a query with a date field in the WHERE condition wrapped in a to_char conversion and compared to a date value given as a string. It was a billion-row table, and a simple count of rows was taking 3 hours to complete. Simply removing to_char from the date field and applying to_date to the date string instead, like to_date('01-Jan-2019','DD-Mon-YYYY'), eliminated the conversion on a billion rows, and the query returned in 0.89 seconds.
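A minimal before/after sketch of the pattern; the table and column names here are hypothetical, not the ones from the actual incident:

-- Bad: to_char() runs against every row's date column, so a normal index on it cannot be used
select count(*)
  from orders
 where to_char(order_date, 'DD-Mon-YYYY') = '01-Jan-2019';

-- Good: convert the literal once and leave the column untouched
-- (note: this matches values at exactly midnight; use a range predicate if the column carries a time component)
select count(*)
  from orders
 where order_date = to_date('01-Jan-2019', 'DD-Mon-YYYY');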

#QueryTuning #OracleTuning

There was an incident reported today where one of the jobs had suddenly started running slow. Actually, it had been running slow for a couple of weeks now, and since the final output of the job was a downloaded file, the initial focus was on troubleshooting the application side of things, to see if something had changed or to find any potential errors or warnings there.

Eventually, the request came to the DBAs, where I saw that some queries were running longer than they had before.

The execution plan of the long-running query clearly showed the problem.

[Screenshot: execution plan before tuning (exec_bad)]

So all the cost was coming from Step 7, which is a full table scan of a table.

The table was missing the required index on the column used in the WHERE clause, and this particular condition was selecting 1 row out of 900k rows.

Putting an index in place reduced the cost of the whole operation from 48K to 8.

[Screenshot: execution plan after adding the index (exec_good)]

And the number of buffer gets dropped from 188,768 per execution to just 11 per execution.
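The fix itself is a single statement; here is a sketch, with a hypothetical column name since the post does not identify the filtered column:

-- index on the column used in the WHERE clause (column name is illustrative)
create index eco_download_initial_msg_i1
    on eco_download_initial_msg (request_status);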

Now the question is: why was this working fine until a few weeks back, and why did it start running slow now?

Most probably, the table in question here, "ECO_DOWNLOAD_INITIAL_MSG", has grown over time and caused this performance issue.

The important thing to note here is that I said "most probably", but I am not 100% sure.

So, to turn this "most probably" into a certainty, it is necessary that we not only capture database growth, which you normally get easily from the Oracle Enterprise Manager repository, but also capture object-level growth for our databases.

Questions like these are easy to answer if we have correct numbers on how our database objects are growing.

There can be a number of ways to achieve this. One simple way that I use is to have a custom schema in my database and run an automated job that captures the size of tables and other objects from DBA_SEGMENTS and dumps that into a table in my custom schema.
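A minimal sketch of that approach is below. The schema name DBMON, the table name, and the daily 1 AM schedule are illustrative choices, and the job owner needs SELECT privilege on DBA_SEGMENTS:

-- history table in the custom monitoring schema (all names are illustrative)
create table dbmon.segment_growth_hist (
  capture_date  date,
  owner         varchar2(128),
  segment_name  varchar2(128),
  segment_type  varchar2(18),
  size_mb       number
);

-- scheduled job that stores a daily snapshot of object sizes from DBA_SEGMENTS
begin
  dbms_scheduler.create_job(
    job_name        => 'DBMON.CAPTURE_SEGMENT_GROWTH',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'begin
                          insert into dbmon.segment_growth_hist
                            select sysdate, owner, segment_name, segment_type,
                                   round(sum(bytes)/1024/1024)
                              from dba_segments
                             group by owner, segment_name, segment_type;
                          commit;
                        end;',
    repeat_interval => 'FREQ=DAILY;BYHOUR=1',
    enabled         => true);
end;
/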

There can be better ways of doing this. What is important is to have some approach in place to save this information, so that you can analyze the growth not only of your database but also of its individual objects.

How often do you see this error in your database alert log when you have the "db_recovery_file_dest_size" parameter configured and the archive log destination is set to use the recovery file destination? There is no harm or problem in this approach as long as you are following all the right practices for cleaning up the FRA/recovery destination.

There is a common scenario where this destination fills up completely, or you reach the size specified by the "db_recovery_file_dest_size" parameter.

Issue:

ORA-19815: WARNING: db_recovery_file_dest_size of xxxxx bytes is 100.00% used, and has 0 remaining bytes available.

Cause:

  1. The recovery destination is full.
  2. The recovery destination has reached the upper limit defined by the size parameter.
  3. (Most probably): files in the recovery destination have been removed manually, and the database metadata has not been updated to reflect this.

Solution:

  1. For causes 1 and 2, where the recovery destination is genuinely full: clean the destination using a proper method, e.g. using RMAN to back up and then delete the archive logs (see the RMAN sketch after this list).
  2. Then there is the scenario where, as per the database views/metadata, the recovery destination is full, but when you check at the OS/storage level, there is plenty of space available. This means some cleanup was performed in the past that did not follow the correct procedure. Let's look at this case.
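For the first case, a minimal RMAN sketch; whether to delete the archive logs immediately after backing them up is a policy choice, shown here only for illustration:

# back up all archive logs, then remove them from the recovery destination
backup archivelog all delete input;
# optionally, also purge backups that the configured retention policy no longer needs
delete noprompt obsolete;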

Recovery Destination Parameters Configuration

[Screenshot: db_recovery_file_dest and db_recovery_file_dest_size parameter values (Size_parameter)]
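The same values shown in the screenshot can be pulled with a simple query (or with "show parameter db_recovery_file_dest" in SQL*Plus):

select name, value
  from v$parameter
 where name like 'db_recovery_file_dest%';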

Querying V$FLASH_RECOVERY_AREA_USAGE, we can see that the recovery destination has been fully consumed:

select * from v$flash_recovery_area_usage;

[Screenshot: v$flash_recovery_area_usage output (screen3)]

So, as per this, the FRA is almost 100% full.
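To see the same thing in absolute numbers rather than percentages, V$RECOVERY_FILE_DEST can be queried as well:

select name,
       round(space_limit/1024/1024/1024, 1)       as limit_gb,
       round(space_used/1024/1024/1024, 1)        as used_gb,
       round(space_reclaimable/1024/1024/1024, 1) as reclaimable_gb
  from v$recovery_file_dest;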

But when we checked the FRA destination at the storage level for available and utilized space, that was not the case:


[Screenshot: free space on the recovery destination at the OS level (Free_space)]

So the FRA is approximately half utilized.

Question: why is the database server reporting otherwise?

The answer is that someone manually deleted archive logs from the archive destination, and the database has no automatic way of knowing that. As per the dictionary views, the destination should have been completely full, going by the number and size of archive logs generated since the last proper archive log removal.

So the solution to this problem is to update the database metadata/repository with the correct, current information.

We can do that with RMAN using two commands: CROSSCHECK and DELETE.

First, run the CROSSCHECK command for archive logs:

crosscheck archivelog all;

Second, delete the expired archive logs:

delete expired archivelog all;

Once both of the above commands have completed, check "v$flash_recovery_area_usage" to confirm that the correct used-space information is now being reflected for the archive logs.

[Screenshot: v$flash_recovery_area_usage after crosscheck and delete (Free_space)]

 

Issue/Problem:

Pluggable database opens in Restricted Session Mode after being created from an XML file/image.

User logins receive the following error:

ORA-01035: ORACLE only available to users with RESTRICTED SESSION privilege

Cause:

The pluggable database was created/rebuilt from an existing XML file (which might have been used in the past without an issue). The creation of the PDB ran without an issue except at the very end, where the alert log recorded the message "Completed with errors".

To see the exact cause of the error(s), we can check the dictionary view "pdb_plug_in_violations":

select status, message, action
  from pdb_plug_in_violations
 where status != 'RESOLVED';

In our case, the error was

Database option SDO mismatch: PDB installed version 12.1.0.2.0. CDB installed version NULL.

The issue was Oracle Spatial (SDO). This particular database feature had been removed from the root container (CDB), while the PDB being created from the XML file still had Oracle Spatial enabled.

In the Oracle container database architecture, all PDBs have to be a subset of the CDB, which means PDBs can only have those database features enabled that are enabled in the root container.

So in this case, Oracle Spatial had been removed from the CDB some time back, but the PDB's source XML file still had that feature.

So you are still able to create the PDB, but when you open the PDB read write, the database server automatically puts that PDB instance in "restricted session" mode.

[Screenshot: PDB open in restricted mode (scree1)]

Solution:

  1. Recreate the PDB image (XML file) from the source after removing the Spatial feature at the source.
  2. (If solution 1 is not feasible for you) open the new PDB in read write restricted mode (by default it will open this way because of the error) and remove the Spatial feature from the PDB using the steps below.
    1. Connect to the PDB database, which in this case is "MS_PDB_ODL08".
    2. Execute the script "$ORACLE_HOME/md/admin/mddins.sql".
    3. Close the PDB: alter pluggable database MS_PDB_ODL08 close;
    4. Open the PDB: alter pluggable database MS_PDB_ODL08 open;
  3. Check the status of the PDB, for example with the query below.
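A minimal sketch of that status check, using the PDB name from this example:

select name, open_mode, restricted
  from v$pdbs
 where name = 'MS_PDB_ODL08';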

[Screenshot: PDB status after removing the Spatial feature (screen2)]

AIOUG's first regional chapter, the North India Chapter, is bringing you the first Tech Meet (Tech Day) of 2018 in the NCR.

RAC Internals & Machine Learning – By Sandesh Rao

Sandesh Rao is a VP running the RAC Assurance Team within RAC Development at Oracle Corporation, specializing in performance tuning, high availability, disaster recovery, and architecting cloud-based solutions using the Oracle stack. With more than 14 years of experience in the HA space, having worked on several versions of Oracle with different application stacks, he is a recognized expert in RAC, database internals, PaaS, SaaS, and IaaS solutions, and in solving Big Data related problems. Most of his work involves working with customers on the implementation of public and hybrid cloud projects in the financial, retail, scientific, insurance, biotech, and tech spaces. His current position involves running a team that develops best practices for Oracle Grid Infrastructure 12c, including products like RAC (Real Application Clusters) and storage (ASM, ACFS). More details at http://bit.ly/1UCL46K

All of the above defines only one part of Sandesh; there is a second side to him as well. Apart from all the highly complex technical topics he owns, he is a great speaker. When I say this, I mean he does not need a formal presentation, a formal stage, or even a platform; all he needs is someone with maybe 1% of the enthusiasm he himself has, and he will have converted that 1% to at least 50%, if not 100%, by the time you are done listening to him.

He is an ocean of knowledge. Give him a topic and he can do a deep dive into it. His ability to make you feel at ease and comfortable is unparalleled.

Best of all, the North India Chapter is coming up with a full-day event exclusively by Sandesh Rao, the legend himself.

If you have not yet registered for the event, do not waste any more time and book your slot. We have very limited seats available.

We are looking forward to meeting every one of you on the day.

Cheers!!!

Team North India Chapter.

Link to Event: https://www.meraevents.com/event/aioug-nic-tech-day-feb2018

Link to our Facebook Page: https://www.facebook.com/AIOUG.NIC/

SQL Developer is a powerful tool for both developers and DBAs. With every new version and release, Oracle is making sure that SQL Developer becomes more and more powerful and bridges the gap between DBA and developer. It is no longer just a developer's friend; its horizon has expanded many fold, and I would encourage everyone to start using it. It's easy. It's free. #Devops

Please find my presentation on Slideshare. Reach out to me in case you want to discuss anything on this or have any questions.

 

AIOUG and the North India Chapter concluded yet another event in North India, and this time it was AIOUG's yearly one-day technical event, OTN Yathra. It was held at Fidelity Gurgaon on Sunday, 25th June 2017. As the North India Chapter team, we truly thank every one of you who spared time on a precious Sunday and attended the event.

It was the first time that people of the NCR region got to hear Sandesh Rao (Senior Director, RAC Assurance Team, Oracle), who presented a two-hour masterclass on troubleshooting RAC, and Connor McDonald, a well-established name in the Oracle world, who came all the way from Australia and presented a two-hour talk on Oracle 12c new features for DBAs and developers.

Apart from this, there were other speakers covering a variety of topics, from Oracle Security, Exadata, GoldenGate, and OEM to Quest's Toad and many more.

A big thanks to all the speakers, attendees, and volunteers who actually made this event happen and made it a successful one!!

Cheers from Team AIOUG and North India Chapter.

You can download pictures of the event and all the presentations from the following link:

http://otnyathra.in/downloads/

Link to Photos: https://photos.google.com/share/AF1QipPTjKO49oc3Vx7_X5XxuNpFhBMnpkmVdIXD_PpVwBTQpweIBvqXN-RxZTV8MGIbHA?key=SXA4QTc3X092Tk1NSFZjZC1oNlVJUF95Y1NZamhR

 

 

For more information about the North India Chapter and future events:

Like us on our Facebook page: https://www.facebook.com/AIOUG.NIC/
