Sunday, June 26, 2022

The Emergence Of Smart Agriculture & Its Impact On Cyber


I have to be honest here: I have no formal education or training in Cybersecurity.  Everything that I have written and posted has been self-taught.  In fact, I only took one class in computer science at Purdue, because I was an Ag Econ major.  Back in those days, email, the Internet, cell phones, etc. were all pretty much unheard of.  How did we get the job done back then?

Well, we formed study groups, and we actually learned how to communicate with one another, face to face.  Ghosting people was very rare back then, unlike now.  If we had questions, we had to meet with our professors face to face, or attend help sessions.

All the calculations we had to do were done by hand or with a TI-30 calculator.  If we had to do research papers, we had to find sources the old-fashioned way: either searching through microfiche or the card catalog.

The closest we ever came to technological advancements was getting laser printers and Windows 3.1.  All of this I am telling you about is at the general level.  When it came to the Ag Econ classes, our common tools were the WSJ, textbooks, and the Lotus 1-2-3 spreadsheet application.

Back then, the big topics were international trade with 3-panel diagrams on a chalkboard, the sustainability of agriculture on a global basis, the future of the family farm and how to adopt a succession plan, learning how to predict grain prices and basing trades off of them, learning how to interact and deal with prospects and customers, learning about the economic order quantity, and so on.

Using technology in agriculture was totally unheard of.  Sure, we were taught to some degree how to read weather maps in order to determine how any dryness or rain would affect crop prices, but Cyber on the farm was something nobody even fathomed.  Now fast forward some 30 years, and the agricultural world has done a complete 180 in terms of technological adoption.

Ag producers are now using things like GPS and drones to keep an eye on their fields, especially during the planting and harvesting seasons.  Heck, many more of them are now using smart devices to keep track of grain prices and even place futures trades.  In fact, a new term has now been thrown into the Cyber technojargon mix: “Smart Agriculture”.

When I first came across an article with this new slang, I almost started to laugh.  Sure, we have all heard about smartphones, smart homes, etc., but smart agriculture?  Yep, it’s true.  So how does one exactly define it?  One definition is as follows:

“Smart farming refers to managing farms using modern Information and communication technologies to increase the quantity and quality of products while optimizing the human labor required.”


However, a big component of smart agriculture is the Internet of Things, or the IoT.  As I have written about in earlier blogs, this is where all of the objects that we interact with in both the physical and virtual worlds are interconnected with one another.  A good visual representation of smart agriculture can be seen below:


Although smart agriculture has not taken off as quickly as other technological industries have, there is still great promise for it.  In fact, this market is expected to hit a value of $15.3 billion by 2025, which is only 2.5 years away at the time of this writing.  More information about this can be seen at this link:

You may be wondering at this point; how can technology actually improve the state of agriculture and ag business?  It can make a huge impact, and the following are some examples:

*Collecting hard-to-track data such as moisture levels, the amount of raw input used in feeds, and soil quality.

*Better tools for proper resource allocation.  In other words, making sure that a finite set of inputs is being used properly to produce the maximum grain yields possible.

*Using automation for such things as irrigation and the spraying of pesticides.

*Making better planting decisions.  With IoT, ag producers can get very detailed and granular data to help make the best decisions possible, rather than relying simply on experience and estimating.

*Monitoring the climate:  Back in my day, the closest we ever came to any sort of accurate weather report was WGN 9 news and Tom Skilling.  But now, all of that has changed.  Ag producers have access to very sophisticated sensors and controls that give a minute-by-minute view of how their fields are doing under the current weather conditions.  These can also be set up so that crop conditions are presented on a real-time basis.

*IoT devices for farm animals.  These can be attached to pigs, cows, and other livestock to get a constant gauge on their health.  After all, this is where we get the bulk of our food commodities from, so you want to make sure that your animals are all healthy before they reach the dinner plate.  By monitoring them on a real-time basis, you can help ensure that this happens.

*Using AI and ML.  These stand for Artificial Intelligence and Machine Learning, respectively.  If ag producers are taught how to use these tools effectively, they can be a great boon when it comes to determining the right feed mixtures, and even predicting the weather.
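To make the first point a little more concrete, here is a toy Python sketch of the kind of logic an IoT moisture monitor might run.  The zone names, sample values, and the 30% threshold are all made up for illustration; a real system would read from actual field sensors:

```python
# A minimal sketch of an IoT-driven irrigation decision.
# The sensor data and threshold below are hypothetical examples,
# not a real agricultural API.

def needs_irrigation(readings, moisture_threshold=30.0):
    """Return the field zones whose average soil moisture
    (as a percentage) has dropped below the threshold."""
    dry_zones = []
    for zone, samples in readings.items():
        avg = sum(samples) / len(samples)
        if avg < moisture_threshold:
            dry_zones.append(zone)
    return dry_zones

# Simulated sensor samples: moisture percentage per field zone
field_readings = {
    "north": [28.5, 29.1, 27.8],   # trending dry
    "south": [41.2, 39.8, 40.5],   # healthy
}

print(needs_irrigation(field_readings))  # ['north']
```

The same shape of logic applies to feed inputs or soil quality: collect samples per location, aggregate, and flag anything outside the acceptable band.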

My Thoughts On This:

Well, there you have it: a broad introduction to smart farming.  But keep in mind that while this industry is expected to grow, it will be one of the slower ones.  One of the reasons for this is that ag producers, on a macro level, are slow and resistant to change.  But it will happen.  One thing that might propel the rate of adoption is Cybersecurity.

The United States food distribution system is a complex one.  There are many air-gapped systems here, just like we find in the Critical Infrastructure.  So, as ag producers keep adopting newer technologies, they need to be aware of the Cyber ramifications as well.  For example, one area of weakness is Endpoint Security, especially when it comes to the IoT.

Even in Corporate America, this is an often forgotten topic, and because of that, Cyberattackers are finding their way in pretty quickly and moving laterally in a covert fashion.  Another caveat for smart agriculture is distance.

The ag producer will need to get tools and resources that can not only be easily reached over miles of farmland, but also be maintained.  Covering this great distance could be a problem at first.  Also, data privacy and leakages will be an issue here as well.  Depending upon how much data is being stored and processed, ag producers could be subject to the provisions of the CCPA and GDPR.

But just how far can the ag industry go digital?  Only time will tell.

Saturday, June 25, 2022

The 5 Worst Mistakes You Can Make In IAM & How To Fix Them


Over the years, I have written a ton of content ranging from Biometrics to Cybersecurity, with everything and anything in between. This has resulted in the publication of 9 books, and some 20 eBooks, with this group growing quickly (thanks to KDP, it is a very easy process to get self-published). Just in the last few months I have written some 7 whitepapers, ranging from the CMMC to DevSecOps to Windows 365 restoration.

But there is one area I have not yet written about, and that is the topic of Identity and Access Management (IAM). This is essentially the field where all of the usernames and passwords are properly managed (at least in theory), with the primary purpose of protecting these crown jewels from the hands of the Cyberattacker.

It’s not just simply assigning login credentials and telling employees to follow your security policies; it has become much more complex than this because of the heavy adoption of cloud-based platforms such as AWS and Microsoft Azure.

Within them, there are a ton of sophisticated tools that an organization can use to manage all of the usernames and passwords of their employees.

One such tool is Azure Active Directory. At the simplest level, you can create various user profiles, and from there, you can assign blanket login credentials to whomever needs them. So, through just one login, your employees will be able to gain access to the other shared resources that they need to conduct their everyday job tasks.
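For illustration, here is a very simplified toy sketch of that single-login idea in Python. The usernames, groups, and in-memory session table are all invented for this example; real directories like Azure Active Directory use signed tokens and protocols such as OAuth 2.0 and SAML rather than anything this naive:

```python
# A toy model of single sign-on: one login issues a token, and that
# same token is then honored by every resource the user is entitled to.
# All names and data here are hypothetical.

import secrets

DIRECTORY = {
    "jsmith": {"password": "hunter2", "groups": {"email", "file-share", "crm"}},
}
SESSIONS = {}  # token -> username

def sign_in(username, password):
    """Validate credentials once; hand back a session token."""
    user = DIRECTORY.get(username)
    if user is None or user["password"] != password:
        return None
    token = secrets.token_hex(16)
    SESSIONS[token] = username
    return token

def can_access(token, resource):
    """One token, many resources: access decided by group membership."""
    username = SESSIONS.get(token)
    if username is None:
        return False
    return resource in DIRECTORY[username]["groups"]

token = sign_in("jsmith", "hunter2")
print(can_access(token, "email"))    # True: covered by the one login
print(can_access(token, "payroll"))  # False: not in the user's groups
```

The point of the sketch is the shape of the system, not the mechanics: the employee authenticates once, and entitlement checks happen per resource behind the scenes.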

In fact, the concept of IAM is now becoming a key concern in the Cyber industry, and according to a recent study conducted by Cider Security, it is ranked as the second biggest problem that organizations face when migrating to the cloud. More information about this can be seen here:

But despite the suite of tools that are available from AWS and Azure, trying to gain control of your IAM processes can still backfire for the following reasons:

*There is way too much assumption on the part of the business that the IAM structure the cloud provider has to offer will be 100% congruent with what has already been established. Very often, this is not the case, and the painstaking process of mapping out what goes where needs to happen first.

*After the migration to the cloud, there is often a lack of “command and control” amongst many IT Security teams over the login credentials of employees, because they were not trained beforehand on what to expect.

*If the IAM is not configured properly, many employees will experience the problem of having to log in multiple times in order to gain access to what they need. Very often, new passwords will have to be set up, forcing the employee to remember dozens of them, which defeats the whole purpose of IAM altogether. This will lead to huge employee frustration, with the end result of the “Post-It Syndrome” reappearing, and in a worst-case scenario, the downloading of unauthorized apps.

*Not disabling accounts after an employee leaves the organization.

*If your employees are frustrated (as alluded to before), another problem is that they will simply start sharing passwords once again, causing even more problems down the road, such as the lack of accountability as to who is accessing what.
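On the point about disabling accounts when employees leave, here is a minimal, hypothetical sketch of what an automated offboarding sweep might look like. The account records below are invented; in practice this logic would call your directory's own API (for example, disabling the account in Azure AD):

```python
# A sketch of the offboarding control that is so often missed: any
# account whose owner has already left gets disabled automatically.
# The data model is hypothetical, for illustration only.

from datetime import date

accounts = {
    "jdoe":   {"enabled": True, "termination_date": date(2022, 6, 1)},
    "asmith": {"enabled": True, "termination_date": None},  # still employed
}

def disable_departed_accounts(accounts, today):
    """Disable every account past its termination date; return the names."""
    disabled = []
    for name, acct in accounts.items():
        term = acct["termination_date"]
        if acct["enabled"] and term is not None and term <= today:
            acct["enabled"] = False
            disabled.append(name)
    return disabled

print(disable_departed_accounts(accounts, date(2022, 6, 15)))  # ['jdoe']
```

Running a sweep like this on a schedule, fed from HR's termination records, closes the gap between an employee walking out the door and their credentials going dark.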

My Thoughts On This:

Just as much as you need to create a rock-solid Cybersecurity Policy in general, you need to create the same thing for your IAM platform. You need to take a cut of the cloud platform that you are intending to create and see how well your IAM policies fit into it. In other words, create a sandbox-like environment first, and play around with that to make sure all is well before you release your IAM policies into the production environment.

Second, don’t let the security tools that are available in AWS or Azure dictate your cloud migration. They are only there to help you, so you don’t have to spend extra $$$ in trying to upgrade your security tools. Rather, you need to figure out how those tools can fit into your IAM environment. I know that Azure has an entire security center full of stuff, but it is up to you to figure out what you really need and how it will fit in.

Third, make IAM one of your first priorities in anything you do that relates to your IT and Network Infrastructure. Both the authentication and Cyber threat landscapes are becoming extremely dynamic and complex; by staying on top of your IAM needs and objectives, you will leave fewer backdoors behind for the Cyberattacker to penetrate.

Fourth, whatever tools you use (for example in Azure), make sure that you configure them to your own settings. Never rely upon the default settings!!!

Fifth, managing an IAM platform is not as easy as you think. It can become quite complex, depending upon the size of your organization and the kind of cloud deployments that you are intending to have. Thus, don’t be afraid to ask for help. This is where the role of the Cloud Services Provider (CSP) can be of immense help. Not only can they help you with your configurations, but they can manage them for you as well, so that you stay focused on what is most important to you:  running your business.


Sunday, June 19, 2022

The Benefits That Digital Twins Bring To Cybersecurity


In all of the writing that I have done in the past 13 years, there seems to be one central thread:  The Cloud.  Back then, it was all about data storage.  But fast forward to now, and with either AWS or Microsoft Azure, the world is your oyster.

It is totally unfathomable what you can do with the Cloud today; in a way, it is like the technology in Star Trek.  And don’t discount it, food replicators are still a real possibility down the road.

Today, many businesses have made the full transition to the Cloud, but there are still some out there who choose to remain totally On Prem or have a hybrid approach of sorts.  Perhaps there is the fear of a total loss of control, or the processes that are still there simply cannot be moved to AWS or Azure.

Typical examples of this are the manufacturing, supply chain, and logistics industries.

Their processes are so legacy-based that it is totally infeasible to move them to the Cloud.  It’s like our Critical Infrastructure: you simply cannot put electrical wiring into a private Cloud.  So what are these industries to do, given that there is still a ton of advantages they can get out of the Cloud?  Well, the answer lies in creating what is known as a “Digital Twin”.

This is where an organization takes an existing process that they have On Prem, and creates an exact (or as close as possible) replica of it in the Cloud.  So for example, imagine the processes that are used at Boeing to build a 787.

Take a subcomponent of that, such as mounting the jet engines to the wing.  Through careful design, this process can actually be replicated in the Cloud.  Not too many people have heard of this yet, but it is a growing market, which is right now valued at about $5 billion.  It is expected to grow at 35% per year by the time we hit 2027.

Now keep in mind that this kind of technology has existed for quite some time.  For example, the Boeing 777 was the first airplane to be designed totally by computers.  But the tools have now become advanced enough that different “what if” scenarios can be played out in the Cloud on existing processes.

Imagine the water supply of a small town, where the engineers are trying to figure out how to redo some of the piping in order to optimize the flow of water.

Well, there is no need to create a model of it in the physical world; this can all be done in the virtual world.  Many different scenarios can be played out in a short period of time, and ultimately the best possible configuration, causing the least amount of downtime, can be chosen and then implemented back into the real world.  This is now capturing the interest of the Cybersecurity world.
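As a toy illustration of that "what if" process, here is a small Python sketch that scores a few hypothetical piping configurations in a virtual model and picks the winner.  The scoring formula and the candidate numbers are invented purely to show the idea; a real Digital Twin would run an actual hydraulic simulation:

```python
# Scoring candidate configurations in a virtual model instead of
# rebuilding the physical system. Everything here is a stand-in.

def simulated_flow(config):
    """A stand-in for a real hydraulic simulation: reward flow
    capacity, penalize the downtime the change would cause."""
    return config["capacity"] - 2.0 * config["downtime_hours"]

candidates = [
    {"name": "reroute-A", "capacity": 120.0, "downtime_hours": 10.0},
    {"name": "reroute-B", "capacity": 110.0, "downtime_hours": 2.0},
    {"name": "reroute-C", "capacity": 130.0, "downtime_hours": 20.0},
]

best = max(candidates, key=simulated_flow)
print(best["name"])  # reroute-B: a bit less capacity, far less downtime
```

Notice that the "best" option is not the one with the highest raw capacity; the virtual model lets you weigh downtime into the decision before anything is touched in the real world.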

At the present time, an IT Security team can quickly model (using AI and ML) what future threat variants can look like, but they are still left to guess, using their own judgements, as to what the impact could be.  But soon, this guessing will not be required. 

Now, a company can replicate their entire IT and Network infrastructure into their own private Cloud, and get a real simulation of what these new threat vectors could bring to the table.  Think of this environment like a “Super Sandbox”.

From here, the IT Security team will be in a much better position to implement the right security controls with a much greater level of confidence.  Although this sounds great in theory, there are two areas of major concern that need to be dealt with first, which are as follows:

1)     Data leakage:

Whenever you create a Digital Twin in the Cloud, you are essentially creating a bidirectional flow of data.  Meaning, whatever new datasets are captured On Prem will be automatically sent to the Digital Twin based in the Cloud, and vice versa.  So, there are three things that you need to be aware of:

*Any tests that you do on the Digital Twin could be transmitted down into the production environment, and cause damage that you were not expecting.  Therefore, you have to make doubly sure that any ripple effects like this are totally eliminated.

*Introducing a new flow of datasets also increases the attack surface that can be easily taken advantage of by the Cyberattacker.  You must also take the time to make sure this bidirectional flow is protected, and is not prone to data leakage.

*Now that you have datasets in two different environments, you need twice the number of controls in order to mitigate the chances of any sort of data leakage from happening.  This can increase the amount of overhead that is required to keep both environments safe: “Protecting the digital twin itself is as important as protecting the system it analyzes.”


2) Many mistakes could still be made:

Although the actual production environment may be working fine, trying to build an exact replica of it in the Cloud still may not be feasible yet.  Let’s go back to my earlier example of the optimization of water flow.  While this is theoretically possible to do, building out such a large-scale replication will not only take time to accomplish, but mistakes could also be made if the original blueprints are not available.  For that reason, many advocates of the Digital Twins concept (such as the Digital Twins Consortium) highly recommend that an IT Security team start small, then build up from there, as the processes become more known and established.
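To illustrate one way of taming the bidirectional-flow risk described under the data leakage concern, here is a hypothetical Python sketch in which experiments made on the twin are quarantined until a human explicitly approves them for syncing back down to production.  The state dictionaries and function names are invented for this example; a real deployment would sit on message queues or the cloud provider's twin services:

```python
# A minimal guard against "ripple effects": twin-side changes are
# queued, and only explicitly approved ones reach production.

production_state = {"pump_rate": 50}
twin_state = dict(production_state)  # the Cloud replica starts as a copy

pending_changes = []  # twin-side experiments awaiting review

def experiment_on_twin(key, value):
    """Apply a change to the twin only; production is untouched."""
    twin_state[key] = value
    pending_changes.append((key, value))

def sync_approved(approved):
    """Push only the explicitly approved changes down to production."""
    for key, value in pending_changes:
        if (key, value) in approved:
            production_state[key] = value
    pending_changes.clear()

experiment_on_twin("pump_rate", 80)   # test a higher rate in the Cloud
print(production_state["pump_rate"])  # still 50: no accidental ripple
sync_approved({("pump_rate", 80)})
print(production_state["pump_rate"])  # 80, but only after human review
```

The design choice being sketched is simply that the downward half of the bidirectional flow goes through an approval gate rather than being automatic, which is one way to eliminate the unexpected ripple effects mentioned above.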

My Thoughts On This:

There are two things to keep in mind here about the concept of Digital Twins:

*We are still only in the beginning stages of it – as many experts predict, we are at least 15-20 years away from it becoming mainstream in our society.

*A Digital Twin is not meant to be viewed as just a one-point-in-time picture – rather, it is dynamic in nature, and it should grow over time to its fullest degree in order to realize the full benefit of it.

Surprisingly, I like the idea of the Digital Twins, especially what it means for Cybersecurity.  But my question now is: Ultimately, is the world going to be totally digital, where everything is represented as objects in the virtual world?  The Metaverse is pointing in that direction.

In other words, are we going to lose our own, physical identities and be branded by some sort of Avatar?  The answer is yes, and to me, that is very scary.

Finally, to download a full report on the Digital Twins from Capgemini, click here:


Saturday, June 18, 2022

4 Reasons To Rethink Purchasing An NFT


Two years ago, right before the COVID-19 pandemic hit, the world was awash in new digital technojargon.  Things such as blockchain, virtual currencies, Bitcoin, and Ethereum were just starting to pick up steam.

But as the Remote Workforce took a permanent hold on our society, these technojargons became a reality to everybody.  For example, Bitcoin was all the craze, and pretty much the entire world saw how you could make a quick profit from it, as well as suffer huge losses.

The blockchain was the next big technojargon to come out, because the virtual currencies resided on this platform.  Now, with the normal financial markets tanking, the value of Bitcoin has also dropped off greatly, as people now have to deal with the issues of using real currencies.

But as the markets go up, it is also expected that at some point in time, the virtual currencies will too.  Maybe not as fast as before, but it will happen.

So until this happens, there is yet another new digital term that is now picking up the slack.  This is what is called the “Non-Fungible Token”, or “NFT” for short.  Technically, it can be defined as follows:

“An NFT is a digital asset that represents real-world objects like art, music, in-game items and videos. They are bought and sold online, frequently with cryptocurrency, and they are generally encoded with the same underlying software as many cryptos.”


The concept of NFTs is really nothing new; in fact, it has been around since 2014.  It first gained public attention when a digital collection of artwork created by an artist known as “Beeple” sold for a jaw-dropping sum of nearly $70 million.  So remember the Mona Lisa by Leonardo da Vinci?

Its digital representation could now be called a Non-Fungible Token (I am going on the assumption that it is out there on the digital superhighway).

So really, in the end, NFTs are nothing but digital representations of objects that exist in the real world.  It is a great way for people with $$$$ to get access to physical items, but in virtual form.  The key differentiator here is that NFTs can only be purchased with a virtual currency, such as Bitcoin.  Traditional currencies won’t work in this market arena.

At the present time, the NFT market is worth well over $41 billion, and is expected to grow in the near future.  Now that you hopefully have a little bit better understanding of what an NFT is, a future blog will go into all of the technical details of how the whole process works. 

Also, keep in mind that using an NFT is a great way for the creative class (such as artists, musicians, novelists, etc.) to protect their original works of art.  It also gives the owner bragging rights that they own the digital version of an original creative piece of work, which is part of what makes NFTs so expensive.

But with the good comes the bad.  Because of their rapid growth and market value, NFTs are now also becoming hot prey for the Cyberattacker, just as much as Bitcoin was (the Cyberattacks that occurred there are known as “Cryptojacking”).  How is this happening, you may be asking?  It can happen in four different ways:

1)     Impersonation:

This has proven to work quite effectively with Social Engineering and Deepfakes, and in the world of NFTs, it is becoming a problem as well.  For example, a Cyberattacker can easily set up a phony website using the brand name of a popular sports figure.  From here, fake NFTs can be sold through an account that the buyer has to set up.  But rather than getting the NFT, the buyer is simply transmitting their login credentials, which can be used for a subsequent attack (such as ID Theft), or which can be sold on the Dark Web.

2)     Illegally produced NFTs:

Even though NFTs are protected by encryption to some degree, it is still not enough.  If a person knows what they are doing, the NFT can be easily replicated and made into a counterfeit.  Unfortunately, the copyright laws are extremely murky here, so if you want to take somebody to court and sue them for the illegal replication of an NFT that you claim is originally yours, good luck.  The courts are not equipped to handle these kinds of cases yet.

3)     Phony platforms:

Just like with virtual currencies, creating a brand new NFT platform is quite easy, as long as you can get your hands on the source code to do so.  With this in mind, it is very difficult to know what is real and what is not, because, again, just like the virtual currencies, NFTs are an unregulated marketplace.  So if you feel that you have been a victim in this regard, there is very little that law enforcement can do, at least for the present time.

4)     A currency that cannot be traced:

By their very nature, virtual currencies cannot be tracked down easily, because they are digital.  And since this too is an unregulated market, there are no audit trails that can be built to track Bitcoin (for example), unlike the traditional currencies.  So if you feel that you have been scammed into paying for a fake, you are going to have a very difficult time getting your money back.

My Thoughts On This:

I have always wondered what NFTs are, so I am glad I finally got to write something about them today.  There will be more articles about this in the future as well, and the next one will give you the technical overview, as promised earlier.  Honestly, I wonder what the craze for NFTs is really about.

Sure, you may own the digital version of some creative work, but to me that is not owning the original. 

I would much rather own the latter, as I feel that gives you more bragging power, and you could probably even sell it for a higher price in the marketplace than you could the NFT version.  I really can’t say at this point just how far NFTs will go.

There is the potential they could go on for a long time, but they could also crash and burn like the .com craze of the late 90s.

But keep in mind that the items most worth having are those that you can hold, touch, and feel.  That is my two cents’ worth.  While the world is going digital, make sure to take the time to enjoy what is in the physical world also.

Wednesday, June 15, 2022

How SMBs Can Meet Cyber Liability Insurance & Compliance Requirements With Judy


As the Cybersecurity Landscape grows more complex day by day, and as new threat variants evolve that are even more dangerous, SMBs in particular need some sort of financial support in case they become a victim.  One area where they can get this is in having a good Cybersecurity Insurance Policy.

Keep in mind that getting a Cyber Policy is not the same thing as getting car insurance.  The process can take a lot longer, and there are more things that an SMB owner has to do in order to even be considered as an applicant.  For example, they have to fill out an extensive questionnaire to prove that they are compliant with all of the controls that they need.

In today’s podcast, we have the honor and privilege of interviewing Raffaele Mautone and Jason Myers, the CEO and Chief Product Officer of AaDya Security, Inc. to talk more in detail as to how an SMB can get a rock solid Cyber Insurance Policy.  You can download the podcast at this link:

Sunday, June 12, 2022

From The RSA Conference: What The CISOs Are Saying For 2022


Just last week, the RSA Conference was held.  This is the biggest Cyber gathering on a global basis, where pretty much every vendor under the sun comes out, sets up a booth, and showcases their latest products and solutions.

Keep in mind that this is not an event for just the larger Cyber companies; even the startups are welcome to have a booth there as well.  It is also a time when the leaders in Cyber, and not just those from the business community, come out to share their thoughts and ideas.

This year’s RSA Conference was a very special one, because it was the first face-to-face one in two years, ever since COVID-19 struck.  With so many people attending, a lot of attention was paid to the CISOs there, who were asked what their thoughts were as we now come into the second half of the year.

So what is concerning them for this time period?  Here is what was discovered:

1)     The lack of workers:

While it has been a known fact for quite some time, this is the first time I have seen CISOs actually disclose that they are worried about filling their empty spots.  Although this feeling was echoed by many of the SMBs, those that have 50 or fewer employees are really feeling the pinch.  The companies polled were also concerned about employees maintaining a strong level of Cyber Hygiene, and about supply chain attacks, such as the one illustrated by SolarWinds.  Even the vetting process used to find the right third party to work with is a strong concern, especially for healthcare organizations.

2)     The movement to the Cloud:

With the Remote Workforce now a permanent fixture in Corporate America, many businesses are moving to the Cloud 100%.  Meaning, they are getting rid of being On Prem and adopting a Private Cloud or even a Hybrid Cloud infrastructure.  But interestingly enough, it is the SMBs that are taking the lead here, not the bigger companies.  For example:

*75% of the SMBs (those with fewer than 50 employees) have either made a full migration to the Cloud, or are planning to.

*Only 13% of the larger businesses (those with more than 10,000 employees) have made a full adoption of the Cloud.

Not surprisingly, software security, especially involving open-source APIs, is a top concern for the CISOs (at 62%), as is the implementation of DevSecOps (at 54%).

Also, the reason why the larger companies have not totally migrated to the Cloud yet is that they still have a lot of legacy infrastructure that has to get moved over.  Since they have larger balance sheets than the SMBs, they can afford to wait to take the big plunge.

3)     Cybersecurity Insurance:

Not surprisingly, this is a need that many CISOs echoed at the RSA Conference.  But they also admitted that they are having a much harder time getting a good policy, because of all of the compliance checks that are now being demanded by the insurance carriers.  Another huge impeding factor is the fact that coverage of Ransomware payments is no longer included in many policies, along with escalating premiums because of the rise in inflation.  In this regard, the insurance carriers are also being blamed for making blanket requirements without assessing the true security environment of an applicant.

My Thoughts On This:

Some good news here is that 74% of the CISOs polled think that they will see an increase in their budgets in the second half of this year.  But on the downside, only 24% are making use of Threat Intelligence.  This is quite surprising, since there are many automated tools out there that can help not only analyze but even predict what the future holds.  This is an area which needs to be paid attention to very closely.

In the end, the one question that did not get asked is how long the traditional role of the CISO will last.  IMHO, the days of hiring a traditional CISO with a great salary, perks, benefits, and stock options are now coming to an end, most likely this year.  Many businesses are now starting to understand the value of a vCISO.

Saturday, June 11, 2022

Slack Is Becoming The Dominant IM Tool - But It Is Not The Safest To Use


As I was out walking last night to take a break from looking at my computer screen, I had a rather long conversation with a neighbor of mine.  We talked about what we do for a living, and he said that he actually does the geometrical layouts for the railroad tracks here on the UP-W in the western burbs of Chicago.  Naturally, my ears perked up . . . math?

I’ve got to hear about this.  He said that there was really nothing you had to understand about geometry to do his job, as everything was done on the computer.  But still, I said, you have to have some background.

So the conversation went on about what I did, and after that, he brought out a book that he was reading, which was all about Web 3.0 and the metaverse.  I told him that it was amazing how far we have come in terms of technology, and what more is yet to come.

Then I posed this question to him:  How did we do it in the 80’s and 90’s when there was no Google or wireless devices?  He gave a very candid answer:  We were forced to talk to each other face to face, and communicate that way.

That statement really got me pondering, and kept me up for a good part of the night.  Really, how did we do it?  What was the first form of digital communications that we used?  I remember getting my first cell phone back in 2001, but the first true digital communications tool I ever used was Yahoo Messenger.

I used it for a very long time, as did many of my other geek friends, until Yahoo discontinued it a few years ago.

Why they ever did that, I don’t know.  It was really a great tool, albeit clunky.  Well, since then, hundreds of other Instant Messaging (IM) tools have cropped up, many of which now come bundled with Zoom, Microsoft Teams, WebEx, etc.  The only standalone IM tool that I know of which is still available is what is called “Slack”.  I used that for a brief stint I had as a Proposal Writer.

My first impressions were that it was rather cumbersome to set up and use at first, but when I was forced to use it (as you can tell, I am not a lover of technology), it seemed to be OK.  I really saw no difference between it and Yahoo Messenger, except that it came with a lot more bells and whistles.  Honestly, I would much rather use the IM chat agent in Teams than Slack.

But one thing I did not realize until today:  While many IM platforms are built on proprietary technology, Slack was actually created using an Open-Source platform.  While this is a good thing, it can also be a bad thing, because now the Cyberattacker has a much easier way to get in. 

Although I am unaware of any large-scale attack against Slack, the following are some examples of what could possibly happen:

*It is quite easy to send spoofed messages to lure in unsuspecting users on the receiving side (other IM platforms that I know of at least send you some kind of alert if a message looks suspicious in nature).

*It has a ton of public channels that go unmonitored, and this is a wide-open terrain to penetrate into.

*Slack allows you to create your own customized apps to meet your IM requirements.  But the problem here is that you have to use their APIs, which are largely open sourced.  Therefore, nobody has really checked whether the source code is secure, or even updated with the latest patches and upgrades.  This could trigger yet another large-scale supply chain attack, perhaps even greater than that of Solar Winds.

*The log files that are kept can only be accessed by the owner of the account.  The IT Security team at Slack has no access to this (which is really surprising), and once any conversations are deleted, they are gone forever (though there is a way to get them – nothing is ever truly deleted in the digital world).

*Many of the other forms of authentication and authorization (such as MFA) are only available in their Pro plan and on up.
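On the spoofed-message and custom-app points above, one safeguard worth knowing about:  Slack publishes a request-signing scheme for apps, where each inbound request carries a signature header that your app can check against its signing secret before trusting the message.  Here is a minimal sketch in Python of how that verification typically looks (the header plumbing around it depends on your web framework, and the secret shown in any real deployment would come from secure configuration, not code):

```python
import hashlib
import hmac
import time


def verify_slack_signature(signing_secret: str, timestamp: str,
                           body: str, signature: str) -> bool:
    """Return True only if the request plausibly came from Slack."""
    # Reject stale requests to blunt replay attacks (5-minute window).
    if abs(time.time() - int(timestamp)) > 60 * 5:
        return False
    # Slack signs "v0:<timestamp>:<raw request body>" with the app's
    # signing secret, using HMAC-SHA256, and prefixes the hex with "v0=".
    basestring = f"v0:{timestamp}:{body}".encode()
    expected = "v0=" + hmac.new(signing_secret.encode(), basestring,
                                hashlib.sha256).hexdigest()
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(expected, signature)
```

Anything that fails this check should simply be dropped, which takes a lot of the sting out of spoofed messages aimed at your own Slack apps.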

So now, this comes to the question:  Given these flaws of Slack, should you stop using it, and use something else like Zoom or Teams?  Ultimately, the decision is yours, and what will work best for your team in the end.  But if you do continue to use Slack, keep these pointers in mind:

1)     Clearly define the public and private channels:

For example, if you have to have a proprietary meeting with your software development team, then the choice is obvious:  Use a private channel.  Make sure that this channel is using an SSL-based Internet connection, and restrict your meetings only to those who have been explicitly invited.  Anybody else should be booted out, without question.  Also, any sensitive material should be shared on a private channel, not on a public one.

2)     Keep apps down:

Yes, apps make our lives a lot easier (or at least we like to think that they do), but given the open-source nature of Slack, keep your app development on it to the barest minimum possible.  If you have to create any apps using Slack APIs, make sure you test the final product in a sandbox environment, and that the APIs used have been upgraded with the latest patches.

3)     Backup, Backup, Backup:

This is probably one of the oldest mantras spoken in the world of Cyber, but it is so true. Backup everything, most especially those conversations that you are having.  In this regard, consider using full, incremental, and differential backups.

4)     Allow additional layers of security:


This means that you are going to have to get a paid plan.  When you get this, you can then throw in the additional layers of security, most notably that of encryption.  But keep in mind that at first glance, the pricing for Slack seems to be very reasonable, which can be seen at the link below:
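To make the backup mantra in point 3 a little more concrete, here is a rough sketch in Python of the difference between a full and an incremental pass:  a full backup copies everything, while an incremental one copies only what changed since the last run.  This is purely an illustration of the idea, not a replacement for a real backup tool, and the paths involved are whatever you point it at:

```python
import shutil
from pathlib import Path


def incremental_backup(src: Path, dest: Path, last_run: float) -> list[str]:
    """Copy files under src modified after last_run into dest.

    Passing last_run=0.0 makes this behave as a full backup, since
    every file's modification time is newer than the epoch.
    """
    copied = []
    for f in src.rglob("*"):
        if f.is_file() and f.stat().st_mtime > last_run:
            target = dest / f.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps
            copied.append(str(f.relative_to(src)))
    return copied
```

A differential backup is the same idea, except last_run always points back to the most recent full backup rather than the most recent pass of any kind.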

My Thoughts On This:

I forgot to mention this earlier in this blog, but I also use another chat mechanism whenever I have to communicate with my ISP about any issues.  I am forced to here, as that is the only way they will communicate (other than Email).  I look at it this way:  If you are going to spend 30 minutes on Slack, why not just simply call the person on the phone and go over what you need to with them?

But then again, I am very old fashioned in my ways . . . .

Sunday, June 5, 2022

Are We Pushing It To The Societal Limits With Edge Computing? 5 Realisms To Consider


Whenever you open up an account on either AWS or Microsoft Azure, a whole, wide world opens up to you that was once thought to be unfathomable.  In fact, I was just talking about this with an old professor of mine last night. 

I did my thesis with him when I was doing my MBA, and back then Windows NT and Netscape were some of the biggest technology platforms out there.  I asked him point blank:  Did we ever think that technology would evolve to the point where it has today?

Of course, his answer was no, as mine would be as well. Back then, creating an Oracle database On Premises would have easily cost around $30k.  Not to mention that their licensing was almost impossible to understand.  But now, with just a few clicks of a mouse, and five minutes, you can create a hosted Oracle Enterprise Database server for just around $70.00 a month or so.

Now with these Cloud juggernauts, you can access shared resources and datasets whenever and wherever you may be in the world.  Heck, you can even create your own Virtual Datacenter and have that stored across different regions in the world as well. 

But as all of this advances forward, so do our needs and wants.  In other words, we can’t be happy with what we have; we always want more and more, which is simply human nature.

A perfect example of this is what is known as “Edge Computing”.  There is no doubt that AWS and Azure can handle gargantuan amounts of data (we are talking about Petabytes here), and they are very quick in processing it.  But now, we want this to happen even faster. 

The basic premise behind Edge Computing is that the Virtual Machine (VM) that stores and processes your data will actually be closer to the sources of data that is being fed into it.

The idea with this is that the transaction times for your data queries will be even faster, and you will get what you need even quicker as well.  The concept sounds relatively simple, but it can be complex to deploy, given whatever your requirements are.  The technology to do all of this is still relatively new, so along with that, comes the security risks.

I came across an article that actually addresses these fears, so I wanted to share it along, in case your company has either implemented it or is planning to do so.  Here we go:

1)     Malicious payloads can be inserted easier:

Web applications have always been a favored target for the Cyberattacker, especially when it comes to the backend, primarily the database.  From here, the hacker can launch SQL Injection Attacks, in an effort to capture the PII datasets that are stored in them.

2)     The attack surface is increased:

This line of thinking is normally associated with IoT devices, because as you add more devices, the interconnectivity grows even more.  Although with Edge computing the gap between the VM and the data feed has actually become narrower, in some respects, it has also been widened as well.  This also makes it just as vulnerable to Cyberattacks, if not more than with IoT devices.

3)     Routing Attacks could increase:

When the data source is close to the server which is processing it, this makes it even more tempting for the Cyberattacker to try to conquer.  The primary case for this is that many businesses still neglect to fortify their endpoints even in the most traditional of deployment models, so the chances of unprotected endpoints are even greater with Edge Computing.  Not only will it be easier to capture the PII datasets, but there are even greater probabilities that further tampering with the network flow of communications will occur, due to the close proximity of the data source to the VM.

4)     DDoS attacks will still occur:

This is an acronym that stands for “Distributed Denial of Service”.  This is where a server is bombarded with malformed data packets and web requests until it is totally drained of its processing power, and crashes as a result.  These are probably some of the oldest threat variants in the book, but they are still widely used by the Cyberattacker in different flavors.  But once again, by having the data source so close to the VM, it will be far easier to launch these kinds of attacks, and in rapid-fire succession.
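Going back to point 1, the classic defense against SQL Injection is the parameterized query, where user input is passed to the database as data rather than spliced into the SQL text itself.  A small sketch using Python’s built-in sqlite3 module (the users table here is hypothetical, purely for illustration):

```python
import sqlite3


def find_user(conn: sqlite3.Connection, username: str):
    # BAD (commented out): string concatenation would let input like
    # "' OR '1'='1" rewrite the query and dump every row:
    #   f"SELECT id, email FROM users WHERE username = '{username}'"
    #
    # GOOD: the ? placeholder sends the value separately from the SQL
    # text, so attacker input can never become executable SQL.
    cur = conn.execute(
        "SELECT id, email FROM users WHERE username = ?", (username,))
    return cur.fetchall()
```

Every mainstream database driver supports some form of this, so there is really no excuse for building queries by gluing strings together.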

My Thoughts On This:

So now you might be asking, how can you mitigate the chances of this happening to you?  Well actually, there is some good news here, believe it or not.  If you have a Cloud based deployment in which you plan to deploy Edge Computing, your provider will already have a great set of tools that you can use. 

I know I may be sounding a little biased here, but my hat goes off to Microsoft Azure in this regard.  Although the number of security tools they have in stock may overwhelm you at first, they are quick to install and will keep an eye on your Edge Computing setup on a continual, real-time basis. 

But you should probably consult with a Cloud Services Provider first to make sure that you are planning to use the right tools.

Second, follow an established framework, such as those available from the Cloud Security Alliance (CSA), the Cybersecurity and Infrastructure Security Agency (CISA), and the National Institute of Standards and Technology (NIST). 

This will help you set up the right set of controls that you need in order to come into compliance with the data privacy laws of the CCPA, GDPR, HIPAA, etc., as it relates to Edge Computing.  The very last thing you want is to face an audit and steep financial penalties.

But now all of this comes to an interesting question:  As a digital society, are we moving way too fast?  Yes, and I will be blunt, I think we are moving too fast.  We need some time to settle down, especially with what we have been through the last two years with COVID-19 and the emergence of the permanent Remote Workforce. 

Really, in the end, does getting answers to data queries a few seconds faster by using Edge Computing really mean anything?  No, it does not.  As long as we can access our data safely and securely within a reasonably quick timeframe, that is all that should matter.  We are now even pushing it with the adoption of the Metaverse. 

Is it really needed now?  No, it is not.  Let us take some time, slow down, and enjoy the great technological advancements that we have right now before rushing off to find the next great thing.

Saturday, June 4, 2022

What Is Predicted For The Second Half Of 2022? More Supply Chain Attacks


Well, here we are now in the first full weekend in June.  Finally, summer is here, and it feels great.  But with the way time is going, soon we will be complaining about winter approaching.  On the Cyber front, things are moving quickly, but believe it or not, so far we have not yet had the major catastrophes that were feared after Russia invaded Ukraine.

But we still have half of the year to go, so anything is still possible.  But as I go through the news headlines, this fear seems to be diminishing somewhat.  In fact, there are articles even out there now that discuss why Russia has not launched a major Cyberattack yet.  More to come about that in future blogs.

I think as I mentioned last week, and on a couple of podcasts I have had recently, the attention is still on Ransomware and whether or not a payment should be made to the Cyberattackers that launch these kinds of assaults.  But, another hot button topic that is reemerging yet once again is that of Cybersecurity Insurance. 

A lot of businesses now seem to be concerned about this, especially given that it is more difficult now than ever before to procure a rock-solid insurance policy.  A business owner now has to go through a whole rung of compliance checks and even fill out a self-assessment before their particular application is even looked at.  I know this for a fact, as I have seen some of them.  Yes, they are detailed, and take time to go through.

Another topic that has seemed to resurface again is that of supply chain attacks.  I actually have reviewed what this is in previous articles, and in a few podcasts.  But to refresh, rewind back to some time ago.  Remember the entire Solar Winds fiasco?  Well, for lack of a better term, this is a prime example of such an attack. 

Essentially, the Cyberattackers were able to find a small hole in the IT/Network Infrastructure, and at that point, they were able to install their malicious payload.

Once the victims downloaded the needed software patches, their devices and other systems became infected with this malware.  But we are not talking about just a few victims, we are talking about literally thousands of them.  But what is unique about the supply chain attack is that it is not what you would call a one to one (1:1) attack, but rather, a one to many (1:N).  In fact, this is the trend that we will now be seeing as these kinds of attacks happen even more.

In the mind of the Cyberattacker, why waste valuable time just hitting one target, when you can hit a lot more, especially all at the same time?  But keep in mind, the key is finding that vulnerable weak spot.  But there are other reasons for this, and these include some of the following:

*Supply chain attacks allow the Cyberattacker to reap a quicker payoff in terms of financial gain, because so many victims are involved.  And of course, the more of them there are, the quicker the ROI will be.  Don’t forget the press attention that this will get.  Remember, Cyberattackers are humans also.  Once they get the attention in the press, they get a high on their ego (for all of the wrong reasons, of course), and are thus motivated to launch more attacks.

*Web based applications are becoming the norm today in the Remote Workforce environment.  Because of this huge uptick, software developers are under even more pressure to deliver on time.  Because of this, checking the security of the source code often falls by the wayside.  Worse yet, many developers are now heavily relying upon open-source APIs to develop web applications, and these often go untested or unchecked.

*After COVID-19 hit and the Remote Workforce became a permanent thing, many businesses have moved entirely into the Cloud, such as AWS or Microsoft Azure.  But one thing that gets overlooked is that many enterprises choose a Public or Hybrid deployment, which means that the Cloud resources are shared.  Although technically speaking everybody has their own server instance, things are still shared.  So a vulnerability that exists in one tenant could spill into your environment, thus creating a valuable point of entry for the Cyberattacker to penetrate into.

So given all of this, what can you do so that you can mitigate the risks of being a point of delivery for a supply chain attack?  Here are some tips:

1)     Check all assets:

Normally this is done when you conduct a Risk Assessment, in order to determine which of your digital assets are the most vulnerable to a Cyberattack.  However, this gets trickier if you are totally based in the Cloud, because you don’t always know what could be inserted into your platform.  For example, your provider could have put in something you are not aware of, or an employee could have unintentionally deployed something as well.  Therefore, you need to keep 100% watch of what is coming and going in your Cloud deployment on a 24 X 7 X 365 basis.  Of course, doing this manually would take forever, but technology has now come to the point where this can be done automatically for you.  Or, ask your Cloud provider if such tools are available, and if they can set up a system of warnings and alerts as new things pop up in your Cloud environment.

2)     Keep watching and keep assessing:

Apart from monitoring what is going in and coming out, you also need to keep a constant eye on what still remains vulnerable.  You can do this by conducting automated Vulnerability Assessments and Penetration Testing exercises on your Cloud environment.  On top of this, you can make use of what is known as a “SIEM” to give you a bird’s eye view of just what is happening in your Cloud infrastructure, from just one, single dashboard.  In this area, I do have to recommend that you make full use of Microsoft Azure.  They have great tools that you can use at no extra charge to help you do all of this.

3)     Have an Incident Response (IR) Plan in place:

Even after you have taken all of the preventative measures, there is still no guarantee that you will not become a victim.  In this case, you need to have a rock-solid IR Plan in place that will dictate the sequence of activities you need to follow immediately to mitigate the risk of the attack spreading even further.  This plan should be rehearsed at least once a quarter, with updates made to the documentation based on lessons learned.  There are many templates available online that you can use for this very purpose.
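Tip 1 above, automatically watching what comes and goes in your Cloud deployment, really boils down to taking periodic snapshots of your asset inventory and diffing them, then alerting on anything nobody remembers deploying.  A minimal, provider-agnostic sketch in Python (the asset names and descriptions here are made up purely for illustration; a real setup would pull snapshots from your provider’s inventory API):

```python
def diff_inventory(previous: dict[str, str],
                   current: dict[str, str]) -> dict[str, list[str]]:
    """Compare two snapshots of cloud assets (asset id -> description).

    Returns what appeared, disappeared, or changed between snapshots,
    so the security team can be alerted to anything unexpected.
    """
    return {
        "added":   sorted(set(current) - set(previous)),
        "removed": sorted(set(previous) - set(current)),
        "changed": sorted(k for k in set(previous) & set(current)
                          if previous[k] != current[k]),
    }
```

Run something like this on a schedule, feed the "added" list into your alerting system, and you have the bones of the 24 X 7 X 365 watch described above.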

My Thoughts On This:

According to the latest from Gartner, supply chain attacks will be among the top 10 Cyberattack vectors, and up to 60% of security breaches will happen in this fashion.  More information about these stats can be seen in the following links:

But going forward into the second half of 2022, you may not see another Solar Winds attack.  Rather, it is anticipated that the Cyberattacker will launch much smaller scale ones simultaneously, in order to cause more damage without being noticed too quickly.
