25 September 2017

two new articles

Posted by admin @ 11:35 am    categories: Psychology

I’m pleased to note that two new journal articles were published in the past few months. They’re based on research that I worked on over the past year (or, actually, 2-3 years).

One is entitled “Sustained attentional engagement is associated with increased negative self-referent decision-making in major depressive disorder” (Dainer-Best, Trujillo, Schnyer, & Beevers, in press). In this work, we helped solidify the relationship between depression and negative information processing, finding that people who were depressed responded differently to a task about self-referential stimuli, both behaviorally and in EEG.

The second paper is called “Specificity and overlap of attention and memory biases in depression” (Marchetti, Everaert, Dainer-Best, Loeys, Beevers, & Koster, 2018). Dr. Igor Marchetti was the lead on this project. Here, we used a commonality analysis to begin to tease apart the relationship between measures of depression symptoms and two types of cognitive biases: attention bias and memory bias. In this study, we found that memory bias for mood-relevant stimuli was reliably related to depressive symptoms but not anxiety symptoms—the bias was specific to depression.


20 December 2015

Data Encryption for Psychologists

Posted by admin @ 11:55 pm    categories: Psychology, Uncategorized

During yesterday’s debate, there was a discussion of encryption—which a moderator seemed to characterize as some sort of terrorist tool. That’s not a reasonable way of looking at encryption, although encryption does, of course, enable privileged communications. In fact, encryption is important and quite common. Encryption is about encoding a message so that only the people who are authorized to read it can do so; the OED defines “encrypt” as “to convert (data, a message, etc.) into cipher or code, esp. in order to prevent unauthorized access”. The simplest kind of encryption is a code: think substituting numbers for letters, A=1, B=2. Think the coded letters in an Agatha Christie novel, or a spy novel.
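That A=1, B=2 substitution can be sketched in a few lines of Python (a toy illustration only—this kind of code provides no real security):

```python
import string

def encode(message):
    """Map each letter to its position in the alphabet (A=1, B=2, ...)."""
    return [string.ascii_uppercase.index(ch) + 1
            for ch in message.upper() if ch in string.ascii_uppercase]

def decode(numbers):
    """Reverse the mapping: numbers back to letters."""
    return "".join(string.ascii_uppercase[n - 1] for n in numbers)

print(encode("CAB"))      # [3, 1, 2]
print(decode([3, 1, 2]))  # CAB
```

Anyone who guesses the scheme can read the message, which is exactly why real encryption relies on secret keys rather than secret methods.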

But of course we’ve come a long way from a basic cipher. Computer algorithms have certainly made encryption a lot easier. Rather than needing to follow a complex code on your own, you can plug a coded message into a simple program, and it will output the deciphered message. That’s what’s going on in PGP email encryption; that’s what’s happening every time you send an iMessage with your iPhone. Those are encrypted. (How secure they are is up for some debate.)

Before we go any further, let me include a brief disclaimer: I am not a lawyer. I’m writing this based on my understanding of the laws and technology. Because of how complex some of this was, and how rarely it seems to be discussed in the context of psychologists and counselors, I’m writing it out here. But recognize that this is at best educated advice, and that if you’re unsure about best legal practices, you should check with a lawyer before following anything specific. I’m also not an IT professional. While I’ve used many of the services discussed below, and am familiar with some practices, there are always risks inherent to digital information—so back everything up, and use strong, secure passwords that you can remember. You are responsible for any lost data.

HIPAA and Encryption

I’ve been thinking about encryption recently, because I’ve been reading about HIPAA and its requirements for storing PHI1. HIPAA most often affects people when they go to a doctor’s office for the first time and are handed—along with the demographic form and the questions about their insurance—a several-page form that explains when the doctor’s office will share their information, and when they won’t. But it’s more far-reaching than that. As Wikipedia puts it, “Title II of HIPAA defines policies, procedures and guidelines for maintaining the privacy and security of individually identifiable health information as well as outlining numerous offenses relating to health care and sets civil and criminal penalties for violations”.

In essence, HIPAA says: “Hey healthcare providers, you need to keep identifiable information about your patients private, and only share it with other people if you need to or if you’re asked to by the patient. And if you do share it, you’ll need to let the patient know you’re going to do so.” For a clinician working in psychology, this essentially boils down to: “There are a few ways in which I’m a mandated reporter, but otherwise everything we discuss is entirely confidential. I may tell someone [e.g., your insurance company] that you’re seeing me in order to receive payment, but otherwise I won’t release your records unless I’m subpoenaed or you ask me to do so in writing.”

There’s another side of this, though: what do you as a clinician do when you keep information about your patient? Traditionally, this was pretty damn easy: you kept a patient’s address and phone number in a file folder, took paper notes in that folder, and kept everything together. If you saw someone for a long time, or if you ended up with documents, etc., you expanded your paper folder. It all went in a locked file cabinet, within your locked office or locked file room. There’s an ethical dimension to this, of course. Ethically, a psychologist is required to maintain accurate records and notes, and keep them private and confidential. The APA explains that “psychologists protect electronic records from unauthorized access through security procedures (e.g., passwords, firewalls, data encryption and authentication). Consistent with legal and regulatory requirements and ethical standards, . . . psychologists employ procedures to limit access of records to appropriately trained professionals and others with legitimate need to see the records.”2 As you may have noticed, they also mention legal standards, so let’s check in: what does HIPAA mandate in terms of storing this kind of data?

Data Safeguards. A covered entity must maintain reasonable and appropriate administrative, technical, and physical safeguards to prevent intentional or unintentional use or disclosure of protected health information in violation of the Privacy Rule and to limit its incidental use and disclosure pursuant to otherwise permitted or required use or disclosure. For example, such safeguards might include shredding documents containing protected health information before discarding them, securing medical records with lock and key or pass code, and limiting access to keys or pass codes.

Digital Information Security

Having that locked file cabinet was considered to be “reasonable and appropriate”. But what about digital information? More and more records have become digital these days. There are a couple of reasons for this; for one, it’s more space-efficient, since records often need to be stored for years even after a patient is no longer receiving services (in Texas, it’s seven years3). For another, it’s often easier to just type out a few notes in a patient’s digital file than it is to hand-write them, especially if you’re not taking notes during the session. Moreover, billing is often carried out on computers, rather than by hand. (Someone probably still bills by hand, but it’s rare.) Hospitals buy expensive software, and tie providers to login IDs that they use to update patient charts. But when it comes to smaller clinics, or clinicians in private practice, although there are many record-keeping and payment-processing programs available4, things are a lot more up in the air. There are some clear answers, but there’s also a fair amount that people seem very unsure about.

HIPAA is relatively clear about whether those digital records need to be encrypted: no, not necessarily5. Wikipedia puts it a slightly different way: “Information systems housing PHI must be protected from intrusion. When information flows over open networks, some form of encryption must be utilized. If closed systems/networks6 are utilized, existing access controls are considered sufficient and encryption is optional.” Essentially, cloud networks, or data that’s being stored online in some way, needs to be encrypted. But if you’re saving patient data on your personal/work computer (and backing it up—it definitely needs to be backed up, and that backup needs to be password-protected or stored behind lock and key, but I’m not going further into that here), so long as you’re doing that offline, you technically seem to be within the law.

If you’re emailing patients, that’s a little less clear. Guidelines are hard to find. Email is considered non-secure in most cases (barring encrypted email), but the obvious fact that patients often want to email with their providers makes things somewhat more complicated. This is also beyond the scope of this post, but a simple guideline is that additional PHI (beyond name and email address, which are already contained in any message) should be omitted from email wherever possible. The HIPAA guideline on emailing PHI states that “The Security Rule allows for e-PHI to be sent over an electronic open network as long as it is adequately protected.”

Let’s put it this way: if you have a password on your computer, that’s a key. If that password is required any time you shut your computer, then that severely limits access to any stored data—it “prevent[s] intentional or unintentional use or disclosure of protected health information”.
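And if that password is the key, it needs to be a strong one. One approach I’ll mention again in a footnote below is stringing random words together into a passphrase; a minimal Python sketch of that idea (the word list here is far too short for real use—a real list, like the EFF’s diceware list, has thousands of entries):

```python
import secrets

# Toy word list, for illustration only; a real passphrase list
# should contain thousands of words.
WORDS = ["correct", "horse", "battery", "staple", "lantern", "orbit",
         "velvet", "canyon", "pudding", "harbor", "tundra", "quartz"]

def make_passphrase(n_words=4, sep="-"):
    """String together randomly chosen words into a memorable passphrase."""
    return sep.join(secrets.choice(WORDS) for _ in range(n_words))

print(make_passphrase())  # e.g., "orbit-velvet-canyon-staple"
```

The `secrets` module draws from a cryptographically secure random source, which is what you want for passwords; `random` is not designed for that.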

Examples of Good and Bad Practices in Data Security

The good: sufficient, reasonable, and appropriate:

  1. You have a work computer on which you store patient information. Any time the computer is not used for more than 10 minutes, it requires a strong password7 to access your user account. You are the only user of this computer. ✓
    Unless someone steals your computer and uses a brute force hack to figure out your password, your data is secured. This seems like a prudent method of data security.
  2. You have a laptop that you use for work, but also use it at home to stream videos and email. Your child or partner uses the laptop too. Therefore, you have two accounts on the computer. You sign into one with a strong password, as above, and use that for work. Your personal account either has a short password everyone in your family knows, or is completely unsecured, but no one else knows your work account’s password. You always sign out of your work account when you’ve finished. ✓
    Using two accounts means that your data is not going to be unintentionally affected by another user. Your data is secured on a password-protected account that is only used for work.
  3. You use a shared computer (or several shared computers) at work at a small clinic. You sign into a password-protected account to write notes and store those notes locally, on the computer on which you write them. Others who use the computer have their own, separate, password-protected accounts. Files are backed up regularly, in a password-protected backup, on an external hard drive. For billing purposes, you and your colleagues update a shared, password-protected database with client names and the sessions they have attended. This database is stored locally on an external hard drive, and never sent via email. (Alternatively, it is stored on an encrypted cloud service.) All passwords are only known to those who need access to files. ✓
    Keeping files stored locally provides good security, and using strong passwords means that clinicians don’t have access to files they don’t need. Keeping a database is of course often necessary when providing services, but keeping it password-protected means that it is as close to locked as possible. Emailing it might reduce security of this file, since email servers are often unencrypted. However, using an encrypted cloud service8, or keeping the file stored locally, seems sufficient.

The bad: insufficient protection:

  1. You have one laptop computer which you use for notes and billing. You only have one account which is password-protected. However, your child or spouse knows your password, and occasionally uses your computer. ✗
    This is a parallel to the second example above. Why is it not secure? Because the family member has access to the PHI. This situation is obviously better than no password at all, but it is certainly not following best practices.
    How do we resolve this? This could be readily fixed in several ways: you could make a second account for your family members, and only use this account for work. You could also use a service that encrypts specific folders on your laptop so that only someone with an additional password is able to access those files.
  2. You have a desktop computer, or several computers, which is/are used by all staff at your work. There is one primary account on these computers. Clinicians sometimes enter PHI onto the shared account on these machines. Because the machines are shared, they are left logged in all day. ✗
    This is a more complex situation that’s harder to resolve. Why is it not secure? Because clinicians share credentials to access the machines. Moreover, although there may be a secure password, machines are left logged in at all times, and thus could be accessed by unauthorized users.
    How do we resolve it? With one computer only, creating individual accounts for every clinician might be best; a shared local drive could be used for storing information that needs to be shared between staff members (e.g., a database). With several computers, all used interchangeably by staff members, solutions may include individual accounts on every computer, web-based note / billing options, or individual accounts on an encrypted cloud service (see note 8) that users can sign into over the web.
  3. As in the third example above, you work in a small clinic and have shared computer(s). Users have their own accounts, with strong passwords. All users’ account passwords are kept in a spreadsheet by the clinic director. A database for billing and administrative purposes is kept online using an encrypted cloud storage service, and is password-protected. ✗
    This sort of situation is quite common. Why isn’t it secure? Because the passwords to all of the accounts are stored in plain text in a file somewhere. (Many real-life situations are worse, and involve sticky notes with passwords placed next to computers.)
    The resolution is simple: don’t store passwords in print or in files on a computer. There are excellent and free password managers which protect a number of passwords (i.e., here, the individual user accounts) behind a single, strong password9. Alternatively, it’s possible to create a single computer admin account which has access to individual users’ accounts.
  4. You keep hand-written notes for all clients in a locked filing cabinet in your office; the key is kept on your key ring. You use a database with client PHI including names, addresses, and phone numbers. The database is stored on a password-protected account on Google Drive, or stored using email. ✗
    Why isn’t this secure? Although your paper files are secure, the client database is not secure. It’s easy to resolve, though. A database containing PHI either needs to be kept locally (not on the internet), or kept on a secure, encrypted cloud storage service (again, see note 8).
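The plaintext-spreadsheet problem above is exactly what password managers avoid: everything sits behind one strong master password. More generally, any system that needs to verify passwords shouldn’t keep them readable at all. A minimal Python sketch of the standard approach—storing only a salted hash, here using PBKDF2 from the standard library (the parameters are illustrative, not a recommendation):

```python
import hashlib
import secrets

def store_password(password: str):
    """Store a random salt and a salted hash, never the password itself."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the hash from the attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return secrets.compare_digest(candidate, digest)

salt, digest = store_password("correct-horse-battery-staple")
print(check_password("correct-horse-battery-staple", salt, digest))  # True
print(check_password("hunter2", salt, digest))                        # False
```

Even if someone steals the stored salt and digest, they can’t read the password directly; they would have to guess it, which is exactly what a strong password makes impractical.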

Recommendations for Good Data Security

Obviously, I’ve made some recommendations above in discussing good and bad examples of data security. Nonetheless, below are some guidelines for storing electronic patient data ethically and legally.

  1. Anyone accessing a computer that they will use to store PHI should have an account on that machine with a strong password, that is only used for work purposes.
  2. If you have a private practice, use a work account for billing and notes. If you work in a clinic, create individual accounts for every clinician.

  3. Encrypt PHI whenever it is being sent or stored online.
  4. Databases or files that need to be shared should be password-protected and stored on an encrypted cloud-based service, rather than sent via email. (For example, a password-protected database containing client information is stored on encrypted cloud service X. It needs to be shared with person B, so a link to that web service is emailed to them, but they then use their own credentials to sign into that web service to access the database.)

  5. Back up everything in a password-protected backup.
  6. Digital information should be backed up regularly, since it’s very easy to lose files, or have a problem with your hard drive. You can do this securely in a few ways:

    1. Back up to an external hard drive, and keep that hard drive locked in a filing cabinet.
    2. Make an encrypted backup to an external hard drive. (Unfortunately, full backups like Time Machine on the Mac are not encrypted by default, or even protected by a password.)
    3. Back up your important files to an encrypted cloud-based service.

  7. Make sure patients know that email is unsecure, and use email as little as possible.
  8. Emailing with patients is definitely okay by HIPAA, but patients who want to email personal information need to be made aware of the fact that email is not confidential. (And no, adding a disclaimer at the end of your email doesn’t do much. Some quick web-searching will turn up plenty of articles explaining why not. An actual conversation with patients may be warranted.)

  9. Where possible, use file-level or disk-level encryption to encrypt even data stored locally.
  10. Newer Macs have FileVault and newer Windows computers have BitLocker, both of which provide system-level encryption (once you’ve logged in, your files are unencrypted). You can use software like VeraCrypt, DiskCryptor, or AxCrypt to encrypt files on a smaller level (you can read more about these applications here, or elsewhere on the web); these types of software bundle files together, much like putting them into a folder that needs a password to be opened (but a lot more secure). Either adds an additional layer of security, much like having a locked file cabinet inside of your locked office.
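Conceptually, all of these tools do the same thing: they combine your data with a secret key so the result is unreadable without it. Here is a toy Python illustration of that idea using a one-time pad (XOR with a random key of equal length). This is for intuition only; real disk encryption uses algorithms like AES, and for actual PHI you should use the vetted tools above:

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each byte of the message with the corresponding key byte."""
    return bytes(p ^ k for p, k in zip(plaintext, key))

# XOR is its own inverse, so decryption is the same operation.
decrypt = encrypt

note = b"Client reported improved mood this week."
key = secrets.token_bytes(len(note))  # random key, used once, kept secret

ciphertext = encrypt(note, key)
assert ciphertext != note               # unreadable without the key
assert decrypt(ciphertext, key) == note  # recoverable with it
```

The locked-cabinet analogy holds: without the key, the ciphertext is just noise, no matter who gets hold of the file.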

File encryption doesn’t appear to be strictly necessary for HIPAA compliance, at this point. That may change, however. As encryption becomes easier—and more user-friendly—and as more of our lives move online, encrypting PHI and information relating to patient care may increase in importance. As a patient, I would want to know that my doctor or my psychologist was taking pains not just to keep my data relatively secure, but to keep it as well-protected as she or he knew how. As such, keeping such information encrypted, so that it is far harder to hack and certainly can’t be accidentally accessed, seems more and more important.

  1. There are 18 identifiers included under PHI, or Protected Health Information. These are (thanks, wikipedia): Names; geographical identifiers smaller than a state; dates (other than year) directly related to an individual; phone numbers; fax numbers; email addresses; Social Security numbers; medical record numbers; health insurance beneficiary numbers; account numbers; certificate/license numbers; vehicle identifiers and serial numbers, including license plate numbers; device identifiers and serial numbers; URLs; IP addresses; biometric identifiers; full face photographs; and any other unique identifiers. []
  2. American Psychological Association. (2007). “Record Keeping Guidelines.” American Psychologist, DOI: 10.1037/0003-066X.62.9.993 []
  3. Based on the DSHS website. []
  4. There are tons of online/software options for note-taking, billing, and client management. They’re often referred to as dealing with “mental health EHR” or “electronic health records”, and searching online for comparisons will provide lists. They provide such services as data/notes storage, encryption, and billing; some are definitely better than others. []
  5. The full text of that page from the HHS website is as follows: Is the use of encryption mandatory in the Security Rule?

    Answer: No. The final Security Rule made the use of encryption an addressable implementation specification. See 45 CFR § 164.312(a)(2)(iv) and (e)(2)(ii). The encryption implementation specification is addressable, and must therefore be implemented if, after a risk assessment, the entity has determined that the specification is a reasonable and appropriate safeguard in its risk management of the confidentiality, integrity and availability of e-PHI. If the entity decides that the addressable implementation specification is not reasonable and appropriate, it must document that determination and implement an equivalent alternative measure, presuming that the alternative is reasonable and appropriate. If the standard can otherwise be met, the covered entity may choose to not implement the implementation specification or any equivalent alternative measure and document the rationale for this decision. []

  6. A closed system here doesn’t seem to mean a system that never accesses the internet, although that would be the ideal, but rather at least one where the files are stored locally and not accessed by anyone who doesn’t have physical access to the machines. []
  7. It seems like a lot of people have trouble understanding what a strong password is. If you’re interested in reading about the topic, a quick internet search for “strong password security” will bring up a bunch of articles, e.g., this one. There’s also the xkcd “password strength” webcomic, which has a pretty strong method that essentially involves stringing together words for password strength. []
  8. These days, more cloud services are encrypting data when it is sent to their servers, and while it “rests” on those servers. Google Drive continues to be entirely unencrypted (as of December 2015), and is thus not considered at all secure for patient data. Other services (e.g., Box.net, Dropbox) appear to have encryption, and there are also secondary services like BoxCryptor which encrypt files before they are uploaded to those cloud services. These services are relatively inexpensive. Their use with PHI requires signing a business associate agreement (BAA), presumably because they are storing data and metadata with PHI. Box.net officially describes itself as HIPAA compliant, and Dropbox does as well. For someone in private practice, or a small clinic, the BAA requirement may be somewhat complex, but seems to be necessary for compliance.

    BoxCryptor, or other services that encrypt data locally, appear to side-step this problem, as they encrypt your data and never have access to it themselves. BoxCryptor’s website explains (minor modifications mine): “Boxcryptor encrypts the files locally on the user’s device. The encrypted files are only stored on the user’s device and then synchronized to the [cloud storage] provider of choice. Moreover, all sensitive user information (e.g. private keys etc.) is encrypted on the user’s device before [being] uploaded to our servers. So although BoxCryptor is optimized for cloud storage, it does not hold any PHI on its servers. Despite this, we sign Business Associate Agreements (BAA) at no additional cost.” As such, one could also use services like VeraCrypt, DiskCryptor, or AxCrypt, which can encrypt files or partitions, and then upload those encrypted drives to the cloud while complying with HIPAA. []

  9. PCMag has a summary of free password managers: Rubenking, N.J. (2015). “The Best Free Password Managers for 2015.” []

This is an online journal for Justin Dainer-Best.

This blog is being re-developed. Many posts to come will relate to my work in psychology. Older posts are now private; please contact me if you have any questions.
