By one unique metric, we could approach technological singularity by the end of this decade, if not sooner.
A translation company developed a metric, Time to Edit (TTE), to calculate the time it takes for professional human editors to fix AI-generated translations compared to human ones. This may help quantify the speed toward singularity.
An AI that can translate speech as well as a human could change society.
In the world of artificial intelligence, the idea of “singularity” looms large. This slippery concept describes the moment AI escapes human control and rapidly transforms society. The tricky thing about AI singularity (and why the term borrows from black hole physics) is that it’s enormously difficult to predict where it begins and nearly impossible to know what’s beyond this technological “event horizon.”
However, some AI researchers are on the hunt for signs that the singularity is approaching, measured by AI progress toward skills and abilities comparable to a human’s. One such metric, defined by Translated, a Rome-based translation company, is an AI’s ability to translate speech with the accuracy of a human. Language is one of the most difficult challenges in AI, so a computer that closed that gap could, in theory, show signs of Artificial General Intelligence (AGI).
“That’s because language is the most natural thing for humans,” Translated CEO Marco Trombetti said at a conference in Orlando, Florida, in December. “Nonetheless, the data Translated collected clearly shows that machines are not that far from closing the gap.”
The company tracked its AI’s performance from 2014 to 2022 using a metric called “Time to Edit,” or TTE, which calculates the time it takes for professional human editors to fix AI-generated translations compared to human ones. Across that eight-year period, and more than 2 billion post-edits, Translated’s AI showed slow but steady improvement as it closed the gap toward human-level translation quality.
On average, it takes a human translator roughly one second to edit each word of another human translator, according to Translated. In 2015, it took professional editors approximately 3.5 seconds per word to check a machine-translated (MT) suggestion — today that number is just 2 seconds. If the trend continues, Translated’s AI will be as good as human-produced translation by the end of the decade (or even sooner).
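Taken at face value, the trend line is easy to sketch. The snippet below is an illustrative back-of-the-envelope extrapolation, assuming a simple linear fit between the two data points the article gives (3.5 seconds per word in 2015, 2 seconds per word in 2022); it is not Translated's actual model.

```python
# Back-of-the-envelope extrapolation of the Time to Edit (TTE) trend.
# Data points are from the article; the linear model is an illustrative
# assumption, not Translated's actual methodology.

def year_tte_reaches(target, year0, tte0, year1, tte1):
    """Linearly extrapolate the year TTE falls to `target` seconds/word."""
    slope = (tte1 - tte0) / (year1 - year0)  # seconds per word, per year
    return year1 + (target - tte1) / slope

# 3.5 s/word in 2015, 2 s/word in 2022; human baseline is ~1 s/word.
print(round(year_tte_reaches(1.0, 2015, 3.5, 2022, 2.0), 1))  # → 2026.7
```

On this naive fit, TTE reaches the one-second human baseline around 2027, consistent with the article's "end of the decade" framing.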
“The change is so small that every single day you don’t perceive it, but when you see progress … across 10 years, that is impressive,” Trombetti said on a podcast in December. “This is the first time ever that someone in the field of artificial intelligence did a prediction of the speed to singularity.”
Although this is a novel approach to quantifying how close humanity is to the singularity, this definition runs into the same problems as identifying AGI more broadly. Perfecting human speech is certainly a frontier in AI research, but the impressive skill doesn’t necessarily make a machine intelligent (not to mention that many researchers don’t even agree on what “intelligence” is).
Whether these hyper-accurate translators are harbingers of our technological doom or not, that doesn’t lessen Translated’s AI accomplishment. An AI capable of translating speech as well as a human could very well change society, even if the true “technological singularity” remains ever elusive.
Harpo Mander was six years old when she started translating for her parents, who only spoke Punjabi and needed help understanding English and Canadian life.
Mander was tasked with everything, from making sense of medical appointments to ordering takeout.
Now 26, she says the experience forced her to step into adult life early.
"You don't have the bandwidth and you don't have the intellectual capacity to do these adult things, and yet you're being asked to do them at age five, seven, 10," she told The Early Edition host Stephen Quinn.
"It makes you a lot more mature for your age, and it makes you a lot more emotionally intelligent."
Mander is considered a 'translator kid' — children asked to be interpreters for their families from a young age.
Daphne Tse started translating for her parents around age eight — about the time her older sister moved out.
Now, at 26, she's still doing that work.
"It takes a lot of empathy and emotional energy that I think a lot of people don't understand," she said.
She said public services are typically inaccessible for Canadians who don't speak English.
For example, she's currently helping her father apply for Old Age Pension. Aside from the translation, she said the technology has been challenging to navigate — and she works in the tech industry.
"If I wasn't here, what would they have done?"
'Migrants feel like their English is not good enough'
JP Catungal says he was — and still is — a translator kid at 38.
He and his family came to Canada from the Philippines when he was 14. Although his family spoke English, they still struggled to navigate Canadian English and other aspects of Canadian society, he says, which meant he had to help.
"It's not necessarily a lack of proficiency in English that requires the translation work," he said.
"My parents and a lot of migrants feel like their English is not good or good enough because it's not the right type of English that is valued or understood here. There's a kind of hierarchy of English. There's a racialization of that kind of English that they speak."
The Early Edition, 8:01: Event called "Translator Kids" happening later today
Whether it's calling the internet company, or going with your parents to get a prescription refilled: children of immigrants often have to take on extra tasks because they're the only English speakers in the family. We hear more about what it's like to be a "translator kid."
The issue, he says, is that systems in Canada — including health care, the justice system and finances — aren't built for immigrants who use the language differently, or do not speak it at all.
He's working with the Hua Foundation in Vancouver to come up with tools for institutions to deal with language barriers without having to involve children, giving adults the agency to understand and navigate those systems themselves.
'A very convenient way to get interpretation services'
There are some companies in North America trying to alleviate the experiences of translator kids, including one called Language Line Solutions, based in California.
The company offers translation services for more than 200 languages in seconds by phone, using professional interpreters.
Despite their services and others, Chief Marketing Officer Suzanne Franks says she often sees children acting as translators for families.
On The Coast, 5:26: How LanguageLine Solutions eases families' dependence on 'translator kids'
Suzanne Franks, chief marketing officer at LanguageLine Solutions, explains how their third-party service reduces immigrant families' reliance on their children when the parents don't speak English.
"It seems to be a very convenient way to get interpretation services just to ask your child to do it," she told On The Coast host Gloria Macarenko.
"Children can often not interpret correctly, they don't have the emotional maturity and the vocabulary to handle some of these very difficult conversations," she said.
"Consequently, the parent can't ask the next best logical question and get a real understanding of what needs to happen to rectify whatever situation they're in."
Enhancing communication skills
Tse says while it was isolating for her and she felt she had to mature quickly, it was even more isolating for her parents.
But being a translator kid wasn't all bad, according to Mander, who says she learned how to become a strong communicator and that it made her more empathetic.
"You have to sort of test the emotions of the people that you were translating for and then also be able to understand the emotions and the frustrations of the people that you were translating on behalf of," she said.
A Wordle clone called Quordle has been acquired by dictionary publisher Merriam-Webster.
Many spin-offs have launched since Wordle went viral last year, including math variant Nerdle, and others including Absurdle and Dordle.
READ MORE: Why ‘Wordle’ is all over your timeline
Quordle asks users to solve four puzzles at once, and was purchased by Merriam-Webster approximately a year after The New York Times acquired Wordle.
“I’m delighted to announce that Quordle was acquired by @MerriamWebster,” the game’s creator, Freddie Meyer, recently announced on Twitter. “I can’t think of a better home for this game. Lots of new features and fun to come, so stay tuned!”
Last year, The New York Times hired a dedicated Wordle editor to ensure the word list remains “fun, accessible, lively and varied.”
Originally launched in 2018, Wordle’s daily five-letter word has always been chosen from the same list of 2315 words. Now, a new Wordle editor will curate that list.
“After nearly a year of speculation, it will finally be our fault if Wordle is harder,” said the game’s owner, The New York Times.
“Wordle‘s gameplay will stay the same and answers will be drawn from the same basic dictionary of answer words,” the announcement continued.
However, Wordle will now come “with some editorial adjustments to ensure that the game stays focused on vocabulary that’s fun, accessible, lively and varied”.
“While the answer list is curated, the much larger dictionary of English words that are valid guesses will not be curated. What solvers choose to use as guess words is their private choice.”
The following is a transcript of an interview that aired on Information Morning Cape Breton on Jan. 23. This interview has been edited for clarity.
Host Steve Sutherland: Earlier this week on Information Morning, we had an interview with George Paul, the writer of the Mi'kmaq Honour Song, about a new children's book that shares the story of the song.
Today, we're talking about another project to help share the Honour Song with another audience: people who are not able to hear it.
The Honour Song translation project is a collaboration of deaf Mi'kmaw signers, Mi'kmaw elders and others to translate the song into sign language.
One of the Mi'kmaw translators who helped create the sign language version is Holly Green.
Holly also performs the sign language version in a video created by the Nova Scotia Community College which was shown at the Atlantic International Film Festival.
Also joining us is Denise DiGiosia. She is senior adviser for Mi'kmaw Indigenous initiatives, human rights, & equity and inclusion for NSCC. She is also Mi'kmaq.
And by the way, I should mention that Holly is deaf. We are working with an ASL-English interpreter throughout this interview, and actually we can all see each other on Zoom.
Holly and Denise good morning, Kwe'.
Information Morning - Cape Breton, 19:35: Translating the Mi'kmaq Honour Song into sign language
Making the Honour Song accessible to a new audience. We learn about a project to translate the Honour Song into a form of sign language that includes the use of old Mi'kmaq signs.
Denise: Weli eksitpu'k.
Steve: Maybe we'll start with you, Denise. Could you tell us a little bit about NSCC's involvement? How did this project come together?
Denise: Well, I will share with your audience that I've just been in this role [for] over a year. This project had some roots before I arrived at Nova Scotia Community College and it was brought to me by one of our translators that was working with sign language at the college for events like graduation, convocation — different events that the Mi'kmaq Honour Song was being played. [The translator] expressed how there was a gap in being able to sign O Canada, but when it came to signing the Mi'kmaq Honour Song, they had to leave the stage.
There was no official sign in Mi'kmaw and it was flagged [to] me that this was an opportunity. Nova Scotia, at the time, was just starting to enact its legislation to adopt Mi'kmaw language as its official first language. So it was really important in acknowledging the opportunity for NSCC to honour and implement some of the calls to action in the [Truth and Reconciliation Commission] of creating that opportunity for language to be developed and also to bring community together in this work, so we jumped on it immediately. I thought that it was a very important act of reconciliation for the college.
Steve: Wow, that's really interesting. So your ASL interpreter said there was no way to interpret the Honour Song into sign language and had to leave the stage after O Canada, that's really interesting. So Holly Green, you are one of the interpreters and signers. Before getting into that process, tell me a little bit about you. You actually use Mi'kmaw sign language for the Honour Song?
Holly: Yes, so for myself, my father is Mi'kmaq, but I was raised by my mother who is a white woman, so I lost a lot of that part of my life until I would say about 10 years ago. I started on this personal journey to find out who I was and where I'm from, and when I started asking some questions about my history and started finding out a little bit more, I connected with some elders and with some of my friends growing up and I started learning where my family was from and a little bit more about my background.
This project came into my life about two years ago and it was a great opportunity for me to do some more of that exploring and more of that learning about my culture and my history. I can feel that history and that passed-down tradition, but I didn't have it growing up. The other deaf translator, Sheila Johnson, [had] a lot of the Mi'kmaw signs passed down to her. I didn't have a lot of them. So we did this together, understanding that Mi'kmaq is an oral language that also has a written component sometimes, so we're learning more and more as I go along. I'm happy to be able to show [this] to a wider audience, while working with Sheila Johnson.
Steve: So what is the Mi'kmaw sign language?
Holly: Sure. It's a really old language. If you think about spoken Mi'kmaw at the time when people were still hunting and were still out on the land existing in this way, there were deaf people there too, and people were communicating with spoken language, with sign language, all sorts of different varieties of a mixture of spoken and language together, and that's really where the old Mi'kmaw signs come from. Sheila has a deaf family and has a lot of those signs passed down in her family, and that's a lot of what we relied on as well.
There's a history of residential schools that has an impact on all of these languages, and we're trying to revitalize it now with this project.
Steve: Can you tell me a little bit about the process, how you take the Mi'kmaq Honour Song and turn it into sign language? What was the process?
Holly: It was a little bit of a complicated process because we are talking about translating the Mi'kmaw sung language into signs and how to make those connection points there. So you do have to use colonized languages as a bridge to get there, but we really didn't want to dwell in those spaces for too long, so we spoke with the community, with language keepers, what the words really meant and tried to find equivalents to signs to convey those meanings. So we worked just to feel what the feeling was in each of the words, in each of the concepts, and between the two of us, Sheila and I, we talked about what signs would be the most conceptually accurate to depict those meanings.
Steve: And just out of curiosity, do you also use American Sign Language?
Holly: Yeah, especially during the translation process, we had a lot of those conversations in ASL and we had Mi'kmaw elders, who were hearing people, who spoke to us in English, using ASL-English interpreters and Mi'kmaq spoken language, as well. So we were using all of these languages that we had available to us, plus Sheila and I, taking all of that information, using the old Mi'kmaw signs and the sign language that she has to come up with the language that you see in the Honour Song.
It was a large group of us together. It was a little bit of a complicated process with Mi'kmaw language speakers, sign language interpreters working in the room, and all of us coming together to create this. It was a complicated process, but an enjoyable one.
Steve: So, what's a phrase whose evolution to what it ended up being in sign language was interesting to you that you could tell us about?
Holly: If I think about the structure of the Honour Song, toward the end of it, we talked about one of the phrases that was — it could have a very broad meaning, but we couldn't think about a specific sign to use with it. In the song, it flowed really well and we wanted to make sure it flowed really well in the signs as well, so we put it on pause for a moment and then we talked about the rest of the song. Conceptually, we got the song all together. Sheila and I talked about it for an hour and we moved on to a different part and then we went back up to revisit that one part that we were stuck on and then it came to us. It was really, really tough because when you think about Mi'kmaw spoken language and then talking about it in English and then figuring out what Mi'kmaw signs we wanted to put to it, while releasing ourselves from all of that very, very specific words throughout the whole process was really, really difficult.
Steve: What was the concept you were conveying and what's the sign you came up with?
Holly: The one in the song is when it talks about the roots and how people are all interconnected. So we brought all of our people together, they became roots conceptually and then they were brought up to the creator the same way that the honour comes from and conceptually that kept the tone of the song all the way around. That was the part that I really, really liked. It was a really tough part as well, especially considering that we were going back and forth between four languages the whole time: the spoken Mi'kmaw, spoken English, American Sign Language, and the Mi'kmaw signs you see in the video.
Steve: You're listening to Information Morning. We're at 92.1 FM in Eskasoni. We're having a conversation with Holly Green, who is a Mi'kmaw translator. Holly was one of the people involved with translating the Mi'kmaq Honour Song into sign. We're also speaking with Denise DiGiosia, who is with NSCC and was involved in this process.
Holly, I saw the video that was produced for the song, which recognized some of the drummers and dancers, too, from Eskasoni, and it was interesting that when the drummers were chanting, there was a sign that you used: the back of your right hand on the open palm of your left hand, over and over.
Holly: Yeah — I can talk about that. It's really about your heart and when they're chanting and we could see them doing that. We could feel them doing that. For me and how they explained it as well, is we really wanted to show the heartbeat of it. So if you think about the feeling and signs being visual and how it's a chant, we really wanted to express that in sign language as well and keeping the heart as part of it.
Steve: So Denise, what are your thoughts on how the process turned out and where the Honour Song goes from here?
Denise: It was really important to acknowledge — as we started doing this work in a way that transformationally has never been done, in engaging the community, honouring and acknowledging Indigenous intellectual property — it was really important for us to work directly with George Paul and he became quite important in helping us design the visuals on how, in his mind, this song looked, so a lot of the creative components of this are his, using his guidance.
What I'm really most touched by and impressed by is the ability of how it was difficult work, but pulling and centring [the] community and making sure that they led this and that we had people like Holly and Sheila, who are the leaders in this work, acknowledging the value that they hold in their lived experience, in their expertise. [That] was important for me to centre.
When Holly talks about the heartbeat, that's something that I don't think hearing communities acknowledge — that when we see a drum, we can feel it. It resonates in our body. That's not an experience that's shared with deaf communities. So to us, traditionally in our teachings, a drum is our heartbeat, our connection to the land. It replicates the sound that a baby would hear in the womb, so it connects us. It's intersectional and being able to bring that to a reality of a community that would never experience that — to me — is really what is the centre of all of this work, is that inclusive, equitable work that I really hope and dream for and intend to lead the college in.
Steve: Holly, a question for you that is kind of an odd question, I guess, because some of the language used was Mi'kmaw signs that are very old and as you say, not everybody knows them anymore. Some of them you had to collaborate on with Sheila, with elders, with others, to kind of get across a concept rather than a specific transliteration. If I speak ASL, or if I use ASL, and I'm a deaf person and I see it performed with the signs that you and Sheila ended up with, will I understand the language?
Holly: It's a very visual language, so yes, it's accessible to deaf people. Are they going to understand it completely? Probably not. Conceptually, it does make sense, but the words specifically, the same way that a hearing person who speaks English can hear the Honour Song, recognize it as the Honour Song kind of gets the gist of it, a deaf person who uses ASL only would watch that video and understand it in the same way.
Steve: Fascinating. Thank you both very much. Good luck with the new chapter for the Honour Song. We'lalioq.
Holly: We'lalin.
Denise: We'lalioq.
Steve: Holly Green is a Mi'kmaw sign language translator who helped create a new sign language translation of the Honour Song. Denise DiGiosia is senior adviser of Mi'kmaw Indigenous Initiatives, Human Rights, Equity and Inclusion for NSCC.
Quordle is one of the more popular and difficult Wordle clones out there. Instead of solving just one linguistic brain teaser a day, you have to solve four puzzles simultaneously. And now it’s owned by Merriam-Webster, a dictionary company that appears to increasingly be pivoting from definitions to online games.
“I’m delighted to announce that Quordle was acquired by @MerriamWebster,” the game’s creator, Freddie Meyer, recently announced on Twitter. “I can’t think of a better home for this game. Lots of new features and fun to come, so stay tuned!”
After the initial Wordle frenzy began to subside somewhat, some power-users started searching for weirder and more challenging spin-offs. There’s a math variant called Nerdle, one that keeps changing the answer called Absurdle, and many, many more. Then there’s a whole sub-genre of Wordle clones that just keeps stacking up more and more puzzles alongside one another. Dordle has you solve two. Duotrigordle has you attempt 32. At just four, Quordle has always felt like the “just right” Goldilocks amount of Wordle masochism.
And now it belongs to Merriam-Webster. Meyer didn’t immediately disclose the exact dollar value of the sale, but it comes roughly a year after The New York Times purchased the original Wordle for a reportedly low-seven-figure sum. The idea there was for the media platform to grow its games section, an increasingly lucrative part of its business, even if it didn’t immediately stick Wordle behind a paywall.
Merriam-Webster, which is actually owned by Encyclopædia Britannica, Inc., itself controlled by the Swiss investment banker Jacqui Safra, appears to be interested in a similar play. As PC Gamer points out, the dictionary company turned online reference depot already sports a number of other puzzle games, brain teasers, and knowledge tests. It’s probably a better reason to visit the site than hunting for a word’s meaning or a synonym, which Google will serve up instantly from competitor Oxford English Dictionary.
It’s pretty clear that cybercriminals are willing to go to some extreme lengths in order to hack accounts and gain access to sensitive data, but you might not have heard of a dictionary attack.
You should be aware of this method, though, especially if you want to keep your accounts secure – and especially if you’re worried about the password security that your family and friends use.
In essence, a dictionary attack is a type of brute force attack, but it uses recognizable words rather than strings of random letters, numbers and symbols. And when many inexperienced tech users create passwords that use proper words, that can cause big problems.
We’ve explained dictionary attacks here, highlighted how they differ from other hacking techniques, and delved into the best methods for preventing these nefarious attacks. And if you’d like even more security advice, head to our explainer on endpoint protection and antivirus – or explore the best plagiarism-checking tools.
What’s a dictionary attack?
It might sound like the kind of thing that happens during an argument at a library, but a dictionary attack is actually a sophisticated method used by cybercriminals who want to gain access to your email accounts, bank details and social networks.
To understand a dictionary attack, though, you’ve got to get to grips with another kind of hacking method – the brute force attack.
A brute force attack attempts to guess someone’s password by systematically trying every possible combination of numbers, letters and symbols. But while brute force methods can hack into accounts by finding passwords, the nature of the approach means it can take ages. Imagine how long even the best computers would take to motor through every possible combination of characters in a 20-character password.
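To see why, consider the size of the search space. This sketch counts the combinations for passwords drawn from the 95 printable ASCII characters; the guessing rate is an assumed figure for illustration, not a benchmark of real cracking hardware.

```python
# Why exhaustive brute force is slow: the keyspace grows exponentially
# with password length. The guessing rate below is an assumption for
# illustration, not a measurement of real cracking rigs.

ALPHABET_SIZE = 95            # printable ASCII characters
GUESSES_PER_SECOND = 1e12     # assumed rate for a well-resourced attacker
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for length in (8, 12, 20):
    keyspace = ALPHABET_SIZE ** length
    years = keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{length:2d} characters: {keyspace:.2e} combinations, ~{years:.1e} years")
```

Even at a trillion guesses per second, a random 20-character password would take on the order of 10^20 years to exhaust, which is exactly why attackers look for shortcuts.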
A dictionary attack refines this method. Instead of trying every combination of letters, numbers and characters, it works through a list of recognizable words and phrases.
This approach reduces the number of potential passwords a hacker has to try to get inside someone’s account. That cuts back on the amount of time and resources a hacker must deploy to get the job done.
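In code, the core loop is almost trivial. Here is a minimal sketch against an unsalted SHA-256 password hash; the wordlist and "leaked" hash are made up for illustration, and real attacks run wordlists of millions of entries through far faster tooling.

```python
import hashlib

def dictionary_attack(target_hash, wordlist):
    """Return the first candidate whose SHA-256 digest matches, else None."""
    for word in wordlist:
        if hashlib.sha256(word.encode()).hexdigest() == target_hash:
            return word
    return None

# Toy example: a "leaked" hash of a weak, dictionary-word password.
wordlist = ["dragon", "sunshine", "letmein", "football"]
leaked = hashlib.sha256(b"letmein").hexdigest()
print(dictionary_attack(leaked, wordlist))  # → letmein
```

Instead of the full keyspace, the attacker only pays for one hash per candidate word, which is why a short list of likely passwords can succeed in seconds.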
It’s a worthwhile approach. While using words and phrases rather than every potential password combination does reduce the chances of a successful guess, it’s a much quicker process. And when so many people use recognizable words in their passwords, it’s still worth trying for any hacker.
Hackers also refine the process. They’ll develop lists of words that are relevant to the account or person they’re trying to hack – based on location-specific phrases, the local sports teams, or any other information they’ve got about the person behind the account. Criminals will use terms that are specific to organizations if they’re trying to hack into company servers, or build lists based on the most common password terms.
In other situations, hackers will develop dictionary attack lists based on passwords that were already exposed in security breaches. They’ll also include sequential numbers and other common characters in their lists of attack words, so you may not necessarily be safe if you’ve added “123” to the end of your password in order to try and make things more secure.
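Those refinements (appended digits, capitalization, letter-for-symbol swaps) are cheap to generate automatically. A hypothetical sketch of such mutation rules:

```python
# Sketch of the mutation rules described above: each base dictionary word
# is expanded with common suffixes and character substitutions. The rule
# lists here are illustrative; real cracking tools ship far larger rule sets.

COMMON_SUFFIXES = ["", "1", "123", "2023", "!"]
LEET_SWAPS = str.maketrans({"a": "@", "e": "3", "o": "0"})

def mutate(word):
    """Yield common variants of a base dictionary word."""
    for base in (word, word.capitalize(), word.translate(LEET_SWAPS)):
        for suffix in COMMON_SUFFIXES:
            yield base + suffix

candidates = list(mutate("password"))
print(len(candidates), candidates[:3])  # 15 ['password', 'password1', 'password123']
```

One base word becomes fifteen candidates here; with larger rule sets, a modest wordlist covers most of the "clever" tweaks people rely on.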
Dictionary attacks may not be the most effective way to hack an account, but they use fewer resources than brute-force methods. And when too many people reuse passwords, develop passwords based around common words and don’t practice good password security, it’s no wonder that hackers get results.
How can I avoid a dictionary attack?
As time goes on, hacking methods get more sophisticated – which means users have to fight back with increasing ferocity if they want to keep their accounts, details and financial information safe.
Thankfully, there are some simple rules to follow if you want to avoid becoming the victim of a dictionary attack.
The first thing you should do? Eliminate real words and phrases from your passwords, and make sure you don’t have strings of often-used numbers and letters like “123” and “QWERTY”. Dictionary attacks rely on people lazily using these tropes, and avoiding them is the best way to stay protected.
Instead of going down that route, use long passwords with randomized collections of letters, numbers and special characters, and make sure you’ve got a unique password for every account.
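With Python's standard library, generating such a password takes a few lines. This sketch uses the `secrets` module, which is designed for security-sensitive randomness (unlike the general-purpose `random` module); the 20-character length and character set are reasonable defaults, not a formal standard.

```python
import secrets
import string

def generate_password(length=20):
    """Generate a random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # different every run, e.g. 'k#9Qv]x...'
```

A password built this way contains no dictionary words or common sequences, so it falls outside the lists a dictionary attack works through.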
That might sound daunting – especially the prospect of remembering all those passwords – so we always recommend that people use a password manager, too. We’ve rounded up the best password managers here. The top options won’t just save all your passwords – they also generate secure passwords, use encryption to protect your existing codes, and highlight any weak passwords you should change.
When an account offers it, you should also deploy multi-factor authentication. This method uses biometric methods like fingerprint recognition alongside third-party apps and text codes to add an extra layer of security to any account or service. Essentially, it means that a hacker can’t reach your account even if they’ve got your password – because they don’t have the extra bit of information required.
Elsewhere, you should investigate the settings section of each app and account. Lots of services enable you to lock people out if they make a certain number of unsuccessful login attempts, and they’ll often demand that the original owner resets their password. If you want to prevent hackers from having free rein to try and guess your password an unlimited number of times, that’s a smart move.
Our final piece of advice? Make sure you change your passwords frequently. A good password manager can help you in this department by saving passwords, generating new codes and providing reminders.
You certainly can’t stop every hacker, especially given how many password lists are leaked on the dark web, but if you change your passwords then you can stop someone getting into your account – even if they’ve got possession of a password that used to work.
Many security experts recommend changing your password every three months, and some services demand that you do this as an extra layer of security.
There’s no denying that dictionary attacks are worrying, especially when hackers are trying to access sensitive information, bank details, medical records or your email and social media accounts. But if you’re smart about your security and avoid common words, phrases and character sequences, you’ll stay safe – and the hackers won’t get far.