pscipher, psvault and Web Server Passwords

Encrypting passwords is a common activity for PeopleSoft administrators. For many of us, we take for granted that passwords are encrypted by the system. In this post, I want to look at password encryption for web server configuration files and how that works.

Encrypting with pscipher

pscipher is a PeopleTools utility that encrypts passwords using 3DES encryption. The utility is a wrapper for a Java class that handles the encryption and decryption of passwords. If you look at the passwords stored in the configuration.properties file, or produced by pscipher, they look something like this: {V1.1}IsZtCVg15Ls=

To encrypt a password with pscipher:

  1. Navigate to your web server’s [PS_CFG_HOME]\webserv\[domain]\piabin folder.
  2. Run .\pscipher.bat [password]

pscipher will return the encrypted output:

.\PSCipher.bat password

Your environment has been set.
Encrypted text: {V1.1}7m4OtVwXFNyLc1j6pZG69Q==

You can copy/paste the encrypted text into your web server config files. For example, below is the pskey configuration in the integrationGateway.properties file using an encrypted password:

secureFileKeystorePath=e:/psft/cfg/LM014-8.55.09/webserv/peoplesoft1/piaconfig/keystore/pskey
secureFileKeystorePasswd={V1.1}7m4OtVwXFNyLc1j6pZG69Q==
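If you encrypt passwords often, the copy/paste step can be scripted. Below is a minimal sketch, assuming PSCipher's output looks like the transcript above; `extract_encrypted` and `set_keystore_passwd` are hypothetical helper names, and you would supply the real path to your integrationGateway.properties file.

```shell
# PSCipher prints a line like "Encrypted text: {V1.1}...", so keep
# only the encrypted value itself:
extract_encrypted() {
  sed -n 's/^Encrypted text: //p'
}

# Replace the keystore password property in place:
set_keystore_passwd() {
  local file="$1" enc="$2"
  sed -i "s|^secureFileKeystorePasswd=.*|secureFileKeystorePasswd=${enc}|" "$file"
}

# Usage (commented out because it needs a real PIA install):
# ENC=$(./PSCipher.bat 'password' | extract_encrypted)
# set_keystore_passwd /path/to/integrationGateway.properties "$ENC"
```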

You can also encrypt passwords with pscipher through the PIA. Navigate to PeopleTools > IB > Configuration > Gateways. Select a gateway and click “Gateway Setup Properties”. After you log into the gateway, select the “Advanced Properties Page” link. This page lets you modify the integrationGateway.properties file directly, but it also has a Password Encryption section. The Password Encryption tool calls pscipher on your application server to encrypt passwords.


Building New Keys

The {V1.1} at the beginning of the password denotes which key pscipher used. 1.1 means your passwords were encrypted with the default key, so I highly recommend you change it. To create a new key, run pscipher.bat -buildkey. A new key is appended to the psvault file, and pscipher will now generate {V1.2} passwords. Appended is the important word here: you can still use {V1.1} encrypted passwords in your configuration files alongside the newer {V1.2} passwords.
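Because the version tag travels with each encrypted value, you can always tell which psvault key a given password needs. Here is a small shell helper (hypothetical, not part of PeopleTools) that reads the tag off an encrypted value:

```shell
# Report which psvault key version an encrypted value was produced
# with, based on its leading {V1.x} tag.
key_version() {
  local enc="$1"
  local tag="${enc%%\}*}"   # drop everything after the closing brace
  printf '%s\n' "${tag#\{}" # drop the leading brace
}

# key_version '{V1.2}7m4OtVwXFNyLc1j6pZG69Q==' prints: V1.2
```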

psvault Locations

The psvault file is stored under [PS_CFG_HOME]\webserv\[domain]\piaconfig\properties\psvault. When you run -buildkey, that is the file pscipher updates. After you update the key to {V1.2}, you need to copy the updated psvault file to any other web and app servers that you want to decrypt the new passwords.

  • For web servers, copy the updated psvault file to [PS_CFG_HOME]\webserv\[domain]\piaconfig\properties\psvault.
  • For app servers, copy the file to [PS_HOME]\secvault\psvault.

Don’t skip the app servers: when you update your integrationGateway.properties file online (PeopleTools > IB > Configuration > Gateways), any passwords you encrypt through those pages are encrypted with the app server’s copy of psvault.
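Distributing the updated psvault is simple to script. A sketch under stated assumptions: `copy_psvault` is a hypothetical helper, the domain name is an example, and remote servers would use scp rather than a local cp.

```shell
# Copy an updated psvault file into a server's expected location.
copy_psvault() {
  local src="$1" dest_dir="$2"
  mkdir -p "$dest_dir"
  cp "$src" "$dest_dir/psvault"
}

# Web server domain (psvault lives under PS_CFG_HOME):
# copy_psvault /tmp/psvault "$PS_CFG_HOME/webserv/peoplesoft1/piaconfig/properties"
# App server (psvault lives under PS_HOME instead):
# copy_psvault /tmp/psvault "$PS_HOME/secvault"
```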

So far, I haven’t been able to get the Tuxedo domains to recognize the psvault file under PS_CFG_HOME, and Oracle confirmed to me that PS_CFG_HOME is not supported with psvault and the app server. If you are using decoupled homes, this breaks some of the benefits of having a separate, shared PS_HOME codeline. I created an Idea on the Oracle Community site to add support for PS_CFG_HOME and psvault; go vote for it if you would like decoupled home support too.

#65 – CPU Patching

This week on the podcast, Dan and Kyle talk about using web traffic data to analyze user activity, new information on Jolt Failover, and how we generate and distribute compare reports. Then, they discuss Critical Patch Updates and how they affect PeopleSoft Administrators.

Show Notes

#64 – Testing Oracle Cloud w/ Sasank Vemana

This week on the podcast, Sasank Vemana returns to talk about his experience testing Oracle Cloud and using Google Analytics to analyze web traffic. We also discuss the differences in supporting students as end-users.

Show Notes

Refreshes with Data Guard and Pluggable Databases

A PeopleSoft refresh is one of the more common tasks for a PeopleSoft Administrator or DBA. There are many ways to accomplish it, but it usually involves a database restore from backup, custom SQL refresh scripts and potentially ACM steps. Depending on the level of effort put into the refresh scripts, there can also be manual steps involved. This approach is tried and true, but it tends to lack the speed and flexibility we are starting to expect with the delivery of the PeopleSoft Cloud Architecture toolset. Nightly or ad hoc refreshes and quickly provisioned temporary environments are just a few use cases that would benefit greatly from refresh process improvements. I have been doing some exploring in this area recently and would like to share a few thoughts. First, a quick overview of some Oracle tools and features I have been leveraging.

Data Guard

Oracle Data Guard is a tool that gives you high availability, data protection and disaster recovery for your databases. At a high level, it consists of a primary database and one or more standby databases. These standby databases are transactionally consistent copies of the primary database. Therefore, if the primary database goes down, the standby can be switched to primary and your application can keep on rolling.

Physical vs. Snapshot Standby

There are multiple types of standby databases that can be used with Data Guard; I’d like to briefly explain the difference between a physical standby and a snapshot standby. A physical standby is kept in sync with the primary via Redo Apply: redo data is shipped from the primary and applied to the standby. A snapshot standby is a physical standby that has been converted to a snapshot, essentially a point-in-time clone of the primary. At that point we can use the snapshot for development, testing, etc. When we are done with the snapshot, we convert it back to a physical standby and it is once again kept in sync with the primary. This works because a restore point is taken when the snapshot conversion happens. The whole time the standby is in snapshot mode, redo data is still shipped from the primary, but it is NOT applied. Once we convert back to physical, the restore point is used to restore the database and all waiting redo is applied.

Pluggable Databases

With Oracle 12c we have the introduction of the multitenant architecture, which consists of container databases (CDB) and pluggable databases (PDB). This setup makes consolidating databases much more efficient, and it gives us the ability to clone a PDB very easily. Cloning a PDB between different CDBs can even be done over a database link. A true multitenant setup requires additional licensing, but you can use a CDB-PDB setup without the extra licensing cost if you keep a single PDB per CDB. Here is a great video overview of multitenant.

Refresh Approach

Now that we have an idea of what these tools and features gain us, let’s think about how to put them to use with database refreshes. Both of the approaches below assume the use of Data Guard and PDBs. A true multitenant setup would be most efficient, but a single-PDB setup will work just fine. I recommend a dedicated standby database for your refreshes, rather than the same standby you rely on for HA/DR. It also makes sense for the standby to live on the same storage as the PDBs you will be refreshing. Neither of these is a requirement, but you will see better performance and lessen the risk to your HA/DR plan.

The use case for this example is a sandbox PeopleSoft database. This sandbox will be refreshed nightly, giving the business an environment to test and troubleshoot in with data from the day before. The refresh could also be run ad hoc if there is a need during the business day. The goal is to have this fully automated and completing as fast as possible.

Clone Standby Approach

This approach will be to take a snapshot of our refresh standby database and clone it, overlaying our previous sandbox PDB. After this is completed, we will need to run custom SQL scripts or ACM steps to prepare the refreshed PDB. Finally, we will restore the refresh standby back to a physical standby database. This blog post by Franck Pachot gives a quick overview of the SQL commands needed to accomplish most of these steps.

  1. Convert the refresh source physical standby to a snapshot standby.
  2. Open the refresh source PDB as read only.
  3. Create database link between the sandbox target CDB and the refresh source PDB.
  4. Drop the sandbox target PDB and create a clone from the refresh source PDB.
  5. Open the new clone sandbox PDB.
  6. Cleanup the sandbox PDB.
    • Check for errors.
    • Patch the PDB to the patch level of the CDB, if needed.
  7. Run custom SQL scripts or ACM steps against sandbox PDB for PeopleSoft setup.
  8. Convert the refresh PDB back to physical standby.
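The numbered steps above map to a handful of SQL commands. Below is a hedged sketch, not a complete script: the names REFPDB, SNDBOX, REFCDB and the link credentials are assumptions, the Data Guard conversions happen at the CDB level on the standby, and steps 6–7 (cleanup and PeopleSoft setup) are your own scripts. See Franck Pachot’s post linked above for a fuller walkthrough.

```sql
-- Steps 1-2. On the refresh standby CDB: stop redo apply, convert,
-- and open the source PDB read only.
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;
ALTER DATABASE CONVERT TO SNAPSHOT STANDBY;
ALTER DATABASE OPEN;
ALTER PLUGGABLE DATABASE refpdb OPEN READ ONLY;

-- Steps 3-5. On the sandbox target CDB: link to the source,
-- then replace the old sandbox PDB with a fresh clone.
CREATE DATABASE LINK refresh_clone
  CONNECT TO clone_user IDENTIFIED BY clone_pass USING 'REFCDB';
ALTER PLUGGABLE DATABASE sndbox CLOSE IMMEDIATE;
DROP PLUGGABLE DATABASE sndbox INCLUDING DATAFILES;
CREATE PLUGGABLE DATABASE sndbox FROM refpdb@refresh_clone;
ALTER PLUGGABLE DATABASE sndbox OPEN;

-- Step 8. Back on the refresh standby: return to physical standby.
-- Requires a restart to MOUNT before the conversion
-- (SHUTDOWN IMMEDIATE; STARTUP MOUNT;) and again before
-- restarting redo apply.
ALTER DATABASE CONVERT TO PHYSICAL STANDBY;
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT;
```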

Snapshot Standby Approach

This approach is somewhat similar, except we won’t be doing any cloning. Instead, we will use the snapshot standby itself as our database. Since we know this sandbox database will be refreshed nightly, we can stay in snapshot standby mode all day, then switch to physical standby mode briefly at night and apply redo data to sync up with our primary production database. After that is done, we switch back to snapshot mode and run our custom SQL scripts and ACM steps. This requires a dedicated standby database and should only be used with a frequent refresh schedule: redo data continues to ship while in snapshot standby mode, so it will start to back up. The volume of backed-up redo could become an issue if it grows too large, so do some analysis to make sure you can handle it based on your refresh interval.

  1. Create a sandbox PDB as a physical standby, with primary database being production.
  2. Convert sandbox to a snapshot standby.
  3. Run custom SQL scripts or ACM steps against sandbox for PeopleSoft setup.
  4. Use the snapshot standby sandbox PDB as your normal database: connect app and batch domains, etc.
  5. Wait until next refresh interval.
  6. Convert sandbox from snapshot standby to physical standby.
    • The restore point will be used and the waiting redo applied, syncing up with the current primary database state in production.
  7. Convert sandbox from physical standby to snapshot standby.
  8. Run custom SQL scripts or ACM steps against sandbox for PeopleSoft setup.
  9. Repeat.
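Steps 6 and 7 use the same CDB-level Data Guard conversions as before. A rough sketch (Data Guard broker users would do this with DGMGRL’s CONVERT DATABASE command instead):

```sql
-- Step 6: sync up with production. Restart to MOUNT before the
-- conversion and again before restarting redo apply.
ALTER DATABASE CONVERT TO PHYSICAL STANDBY;
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT;

-- Step 7: once redo apply has caught up, back to snapshot standby.
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;
ALTER DATABASE CONVERT TO SNAPSHOT STANDBY;
ALTER DATABASE OPEN;
```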

Conclusion

Those are just two ideas, but you can see that there are many variations of these approaches that will work. Leveraging Data Guard and PDBs gives you many options to choose from. I have been using the Clone Standby approach recently and have packaged up the scripts, including bouncing app/batch domains, as an Oracle Enterprise Manager job. This gives me push-button refreshes with a turnaround time under 20 minutes. I have been able to provide ad hoc refreshes for emergency production troubleshooting multiple times in just a few months since implementing this approach. This is a very powerful tool to have, and it is well worth the effort to get your refreshes fast, efficient and automated.

#63 – Revisiting PS_APP_HOME

This week, Dan and Kyle talk about writing PTF tests for PeopleTools, running multiple IB domains and trying ACM again. Then, we revisit our strategies for managing PS_APP_HOME when applying selective maintenance.

Show Notes

#62 – PeopleTools Patch Testing

This week on the podcast, Dan and Kyle talk about load balancing all environments or some environments, Diagnostic Plugins and syntax coloring code. Then, they dive into getting current and how to test PeopleTools patches.

Show Notes

Announcing the Deployment Packages Course

We are excited to announce a new course, Mastering PeopleSoft Administration: Deployment Packages. This course will teach you how to make Deployment Packages (DPK) work for you. We begin the course by exploring the different components that make up the DPK, how you can configure the DPK, and how to enhance it. Then, the course shows you how to extend the DPK beyond its current functionality and use it to build servers exactly the way you need them.

Out of the box, the DPK is great for building demo environments. But most organizations have requirements for their environments that go beyond what the DPK can do. Those requirements can include custom signon pages, deploying additional software, and more. Learning how to use the DPK to deploy those requirements as you build environments can significantly benefit you. With the DPK, environment and server builds are repeatable, consistent and fast.

To learn more about the course, check out the videos below.

  • A course introduction video
  • A segment from the podcast where Kyle and Dan discuss why we built the course and what you can learn from it
  • A video on why you should buy a course from psadmin.io
  • Dan’s 48-minute talk on Enhancing and Extending the DPK

Last, there are 5 free lectures from the course available to watch right now. Click here to visit the course page and make sure to watch all the free lectures.

Course Introduction


Podcast Segment about the Deployment Packages Course

Why Buy a psadmin.io Course


Enhancing and Extending Deployment Packages Talk

Dan presented at the Upper Midwest Regional User Group about Enhancing and Extending the DPK (October 2016). This 48-minute talk goes over some DPK basics, introduces configuring the DPK, and covers your options for extending it. If this talk is exciting and you want to know more, the Deployment Packages course goes into the details of this talk and shows you how to accomplish it all.