Software Updates

yogi
Posts: 7502
Joined: 14 Feb 2015, 21:49

Re: Software Updates

Post by yogi »

Thanks for the input, Gary. It was an omission on my part and had nothing to do with file types. Also, if the uploaded picture does not fit the specified parameters for dimensions and file size, an error message will be generated at the time of uploading.
forumadmin
Site Admin
Posts: 42
Joined: 14 Feb 2015, 17:39

Re: Software Updates

Post by forumadmin »

We've gone through a few levels of software upgrades since I last posted here. The most recent site-wide software changes were completed yesterday after a series of snafus with the installation process. As of this post, all three of our Brainformation Forums sites are functioning as expected.

A couple of new features were installed with the last upgrade. The most obvious one is the ability to include Attachments with posts. This feature provides a method to embed images and text documents into the content. It's located in a tab at the bottom of the editing screen. Since this is the very first time we've used this option in all the years we have been in existence, there may be some tweaks needed to make things perfect. All the attachments, by the way, will be stored on our server. If the attachments get overly prolific, that could slow things down, so it would be a good idea not to attach anything unless it can't be explained any other way. Links to off-site images are not affected by this new function.

The second new feature is the installation of reCAPTCHA v3. This is an attempt to reduce and/or eliminate robots trying to sign up for membership. I have to approve all the applications, or discard them, which is a pain in the patuzzie when dozens of bots get aggressive. Unlike all the other CAPTCHA schemes we tried in the past, Google's reCAPTCHA v3 is invisible. It checks registration attempts and logins, but there are no visible signs the check is occurring. The exception is if your login attempts exceed 4 tries. At that point you will be interrogated by reCAPTCHA and have another 4 chances to get it right. If all that fails, you are locked out for a period of time. All in all, login and registration look normal, but Big Brother is now watching.
Kellemora
Posts: 5158
Joined: 16 Feb 2015, 17:54

Re: Software Updates

Post by Kellemora »

I didn't get logged out by the upgrade, but I was presented with the Dark Screen, which made it almost impossible to get back to the white prosilver version. I don't know why so many places are pushing black screens lately, but I find them horrible.

Hope I don't get hit with a CAPTCHA, I LOATHE them!
forumadmin
Site Admin
Posts: 42
Joined: 14 Feb 2015, 17:39

Re: Software Updates

Post by forumadmin »

The Prosilver (Dark) version was set as the default. I changed the board default and the guest account both to prosilver as the default style. You should no longer see dark unless you choose it in your profile.

The dark themes you see so much of these days are popular because they are easier on the eyes; they are basically intended for mobile devices with very small font sizes.

Every CAPTCHA scheme I dealt with previously was a nuisance but necessary to keep the bots at bay. The reCAPTCHA v3 scheme uses some secret algorithm to calculate a score during specific events, such as signing in. I set the threshold for that score but have no clue what affects it. We've been using the new CAPTCHA for 24 hours now and not a single bot has tried to register. Of course, things can change quickly. I've not seen any negative reviews for it other than complaints about it being invisible. Commercial sites show a reCAPTCHA logo when you are filling out forms or registering, but we are running phpBB and those logos aren't showing on our board. I do see it in the admin control panel, which is evidence it is working. I also have access to a site that does all the measuring for us so that I can deal with any trends it shows. It seems complicated, as is most of what Google does, but I'm only concerned about keeping the trash off our site. So far it's doing a good job.
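For anyone curious what that scoring looks like from the server side, here is a minimal Python sketch of reCAPTCHA v3 verification. It assumes Google's standard siteverify endpoint; the secret key is a placeholder, and the 0.5 threshold is illustrative, not the value actually configured on this board.

```python
# A sketch of server-side reCAPTCHA v3 verification, assuming Google's
# standard siteverify endpoint. Secret and threshold are placeholders,
# not the values configured on this board.
import json
import urllib.parse
import urllib.request

VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def verify_token(secret: str, token: str) -> dict:
    """POST the client-side token to Google; returns the JSON verdict."""
    data = urllib.parse.urlencode({"secret": secret, "response": token}).encode()
    with urllib.request.urlopen(VERIFY_URL, data=data) as resp:
        return json.load(resp)

def passes_threshold(verdict: dict, threshold: float = 0.5) -> bool:
    """Accept only if Google validated the token and scored it at or above
    the admin-chosen threshold (v3 scores run from 0.0, bot, to 1.0, human)."""
    return bool(verdict.get("success")) and verdict.get("score", 0.0) >= threshold
```

The admin's only real knob is the threshold passed to `passes_threshold`; everything feeding the score is Google's secret sauce.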
Kellemora
Posts: 5158
Joined: 16 Feb 2015, 17:54

Re: Software Updates

Post by Kellemora »

I'm glad it is working and hiding in the background.
Recently I hit one that showed 4 images and made you pick which ones were bicycles, which was very hard to discern due to the tiny, poor graphics. It did this four times before I could get through.
The CAPTCHAs that used distorted letters and numbers were a royal pain in the arse; I really don't think anyone knew what was being displayed, so I messed up many times.
I left a couple of websites over their Captcha issues.
But if it keeps the bots at bay for you, and is not intrusive, all is good!
forumadmin
Site Admin
Posts: 42
Joined: 14 Feb 2015, 17:39

Re: Software Updates

Post by forumadmin »

I've run into those picture CAPTCHA puzzles and agree with you that they can be difficult to discern. Be that as it may, the idea behind those pictures is that you have to pass the test more than once (if you pass the test too quickly, by the way, you are considered to be a bot). You can even get some wrong and still pass. That's part of the algorithm I mentioned above. It does not expect 100% right answers, but the answers do have to meet a certain threshold to reach a certain confidence level. The admins are the ones who set the confidence levels, but Google does all the analysis. I'd not be surprised if Google's secret is merely to look at a blacklist of spambot IP addresses. LOL That's one factor, I'm sure, and why we've not yet seen any bogus registrations. The bots are pretty clever and will learn what we are doing soon enough. The proof of the pudding will be a few weeks down the line, to see if Google can still outsmart the AI behind the bots.
Kellemora
Posts: 5158
Joined: 16 Feb 2015, 17:54

Re: Software Updates

Post by Kellemora »

I may be wrong about this for today, but in the past, log-in bots used to look at the source code you can view from a browser; all they could see was the HTML, not the PHP or JavaScript driving the HTML display.
They would discern the box that needed to be checked, and its location on the screen if necessary, to create the necessary clicks.
XHTML/CSS was supposed to stop this way of gaining access to websites, but as you pointed out, they found a way around that.
Websites that used PHP or other programming to generate the HTML page after the fact were also supposed to stop the bots,
mainly because the locations of the checkboxes were not in the readable part of the HTML code that was displayed.

Now it seems to me that with recognition programs, they could easily figure out what is in the CAPTCHA and which boxes to click on. Or maybe, like you said, they only try two of the four boxes and hope that's a passing score to get in.
But every day, new technology is added by those who are doing the evil deeds.
It does make one wonder why they don't find something more useful to do than just being pests that need to be exterminated.
forumadmin
Site Admin
Posts: 42
Joined: 14 Feb 2015, 17:39

Re: Software Updates

Post by forumadmin »

As with most things in life, there are many flavors of what we are referring to as "bots." As far as this site is concerned the majority of bots canvassing our content are related to indexing services or marketing their own goods, which is often porn.

The indexing bots are looking for links we post and following them. They keep lists of all those links and sell them to people who are interested in such things. Believe me, it's a HUGE business; even bigger than Facebook's infamous data collection. While they are using up a small bit of our resources, these indexing bots are fairly harmless.

The marketing bots are the ones that try to post to our boards. Of course, they can't do that unless they are registered members. Some bulletin boards allow posting by anonymous guests, and I have tried this. Since it is possible to set those kinds of posts aside for administrative review before they show up on the content pages, I figured it would be relatively harmless to do it. Wrong!!! One or two actually got through the filter, and the others just flooded the moderator control panel. That was a terrible idea on my part, so I stopped allowing guest posts pretty quickly.

Stopping the guest posts doesn't stop them from trying to register. I tried a few tricks, such as adding special fields to the registration form. One was a check box saying "I am..." __dead __alive. The first one, __dead, was checked automatically, and the bot was required to check the "alive" box in order to proceed with the registration. That confused them for a while, but then they figured out how to submit a registration request that validated regardless of what was entered into the form. The Administrator still had to approve such requests before the account was enabled, and the Admin e-mail box was filling up rapidly as a result.
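The dead/alive trick above amounts to a honeypot check. A sketch of the server-side test, in Python purely for illustration (the board runs phpBB, so the real check would be PHP, and the field name "status" is hypothetical):

```python
# Hypothetical sketch of the "__dead / __alive" registration trick described
# above. The real board is phpBB (PHP); this Python and the field name
# "status" are illustrative only.
def registration_looks_human(form: dict) -> bool:
    """The form arrives with 'dead' pre-checked; only a submitter who
    deliberately flipped the box to 'alive' passes the check."""
    return form.get("status") == "alive"
```

A bot that echoes the form back unchanged, or leaves the field out, fails; a human reading the instruction passes.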

Thus the marketing bots essentially figured out how to fool the registration process. The quandary for the Admin then became how to prevent them from attempting registration in the first place without denying legitimate requests. I spent many hours trying to address that problem. The trouble is that the bots don't use a single IP address that can be banned. Once one is banned, they use a different one from the block they own. Ban the block, and they move to another block. There was a time not too long ago when more than 500 Chinese bots were simultaneously trying to register and/or post to our boards. We can deal with a few, but 500 caused unacceptable lag for us old timers.

The solution was not to block IP addresses at all. Instead, the .htaccess file for each directory had to be modified to exclude viewing by specific browsers located in specific countries. This approach still blocks some legitimate viewers, but it was worth it: those Chinese bots don't bother with us anymore.
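For illustration, here is a hedged sketch of what such an .htaccess rule can look like, assuming Apache with mod_rewrite enabled; the user-agent patterns are made-up examples, not the actual block list used here:

```apache
# Illustrative .htaccess fragment (Apache + mod_rewrite): refuse requests
# whose User-Agent matches known bad crawlers. Patterns are examples only.
<IfModule mod_rewrite.c>
    RewriteEngine On
    # [NC] = case-insensitive match; [F] = return 403 Forbidden
    RewriteCond %{HTTP_USER_AGENT} (badbot|spamcrawler) [NC]
    RewriteRule .* - [F,L]
</IfModule>
```

Because the refusal happens at the web server, the bot never even reaches the registration form, which is what finally cut the noise down.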

Recently a bunch of good bots have come to visit us. These are the ones that only index the links and some of our content. They are good in that they help us show up in search engines when we have the links and content somebody is looking for. However, even these bots will slow us down if they get prolific, unless I use the built-in method to restrict what they do. If I can identify a bot, and I can, I can assign it to a bot account. That makes it subject to permissions I can assign, such as not being allowed to view the private Members Only forum. Restricting what they can do as members also reduces the amount of background noise they generate. So, from time to time you will see bots online as Registered Users. Those are the ones I've identified as "good guys" and whose activity I've limited while they view our content.

And, as an aside, robots.txt will also impose restrictions, but not all bots honor what is in that file. The bots cannot get around the .htaccess restrictions or the permissions I impose. Well, not for the time being, anyway.
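For reference, a minimal robots.txt along those lines (the paths are illustrative, not our actual layout); compliant crawlers honor it, but as noted, nothing forces a bot to:

```text
# Example robots.txt -- purely advisory; well-behaved crawlers obey it,
# bad bots ignore it, which is why .htaccess is the real enforcement.
User-agent: *
Disallow: /adm/
Crawl-delay: 10
```

That asymmetry is the whole point of the paragraph above: robots.txt is a polite request, while .htaccess and board permissions are enforced by the server.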
Kellemora
Posts: 5158
Joined: 16 Feb 2015, 17:54

Re: Software Updates

Post by Kellemora »

WOW - I read every single word, and with interest too!
I figure if they clicked DEAD and got booted out, they would just come back and click the next box to see what happens.
I ended up with a bunch of followers on a certain website over a short period of time. At first I was blocking most of them, especially if they had weird @names associated with them.
Then I thought, what the heck, so on another account when it happened, I just accepted them all to see what might happen.
I also had a program for that site that let me know who unfollowed, so I could unfollow them. And if they followed and then unfollowed, they got blocked from following again. Over 3,000 of the roughly 5,000 new followers unfollowed within a week.
But once they started sending me DMs, most with the same text, I found I could block them in a single click, which was great.
Others get a copy of my rules of engagement that tell them if they break any of those rules, they get blocked, hi hi.
So I've whittled about 30 DMs per day down to only 3 or 4 per day, and those are real people with questions.

I know that during the short time my brother tried hosting his own sales page on his server, the bots nailed him good. Thousands per hour were tying up his system. I think he said it was called a denial of service attack, swamping the system. So he went back to using a hosting service for his new and larger product catalog.

I don't remember without going to look, but I think the only thing in my robots.txt is please follow.

I have no log-in to worry about, so no need for other services to keep the bots out.
I do know certain types of bots, I guess indexing bots, do not trigger the page hit counter.
But some others might, because about once a month I will have at least 200 to 300 page hits on a page that, if a real person landed on it, they would move on to the page it sends them to.

You just gave me a good reason why I've never run my own server, hi hi.
forumadmin
Site Admin
Posts: 42
Joined: 14 Feb 2015, 17:39

Re: Software Updates

Post by forumadmin »

In some ways web page bots are stupid. Taking advantage of their stupidity is how to defeat them. That is exactly what is going on with those Google CAPTCHA pictures. The bots can indeed recognize the pictures and might even be able to recognize the object of interest inside them. But that's it. They can't just check the boxes with the right pictures, because that takes reasoning power bots don't have. Even if they get lucky, Google mixes up the pictures in a new order and shows them again. The bot can't figure out the new pattern, and we humans just get pissed but do it anyway. That is the reason you will never get by with solving just one set of pictures. So it was with my check box that defaulted to "__dead" as a status. All the bots knew was that there were two boxes and a check in one of them. My semi-vague instructions did not make sense to them, but they would be intuitively obvious to a human.

The people who write code for bots are paid huge amounts of cash to figure out the nuances. I took advantage of that fact too. Some bot user could pay to have his code supplier figure out what my question means and which box to check. While that is very possible, it's highly unlikely that anybody else on the Internet is forcing the same kind of choice. Since my unique login was a one-off thing, the bots just did what they could (get rejected) and moved on. If 10,000 other websites did the same thing I did, the bot owner would gladly pay to have the coder figure it out. And that is what happens when the payout is worth it to the bot runner. Google's reCAPTCHA takes advantage of the bots' inability to use logic and their inability to repeat a random task successfully. Humans can do it, grudgingly, but it's not so easy to write code for a bot that can.

Followers on a given website can be automated. Donald Trump had millions of followers on his Twitter account, for example. Those bots can be programmed to respond in a certain way and can create an influence trend. The benefit to Trump of having such a following is obvious, but to you and me it's a huge inconvenience when those bots are pushing an agenda we are not interested in. Certain sites attract those kinds of harvesting bots, and there isn't much you can do as an individual. My answer to bots on Farcebook is to create a misleading profile, but that tactic may not be useful on other sites dominated by bots.

It's said that we only see about 5% of what is going on with the Internet. When you host your own website, you get to see most of the rest in the form of robotic information harvesters. There are millions of them out there scanning the web 24/7, and your server logs will fill up quickly if you don't do something to filter them out. That kind of thing is normal in today's world. DDoS (Distributed Denial of Service) is even worse. Those are server requests, not necessarily logins, from millions of machines (or IoT devices) that are part of a bot network. The sole purpose of a DDoS is to shut down the server. The trolls, by contrast, are there to sell you something; they normally don't want to shut you down. It's just that there are a lot of trolls causing the problem.
Kellemora
Posts: 5158
Joined: 16 Feb 2015, 17:54

Re: Software Updates

Post by Kellemora »

I wonder what they gain by getting into websites?
I assume they get to the log-in screen, and answer a question right to get further, but if they are not a member, they don't get past that. Now perhaps they create a profile and become a member, so they can post ads or whatever, but it seems that would get them banned almost instantly.

On most social media sites, the number of followers you have is basically meaningless, for several reasons.
Just because they follow you doesn't mean they follow your posts; they probably don't even look at the main news feed.
Most folks use Lists and add the followers they want to see every day to those Lists.
So if a bot posts something to the main news feed, I would never see it.
For those reasons it appears to be a useless endeavor.
forumadmin
Site Admin
Posts: 42
Joined: 14 Feb 2015, 17:39

Re: Software Updates

Post by forumadmin »

I must agree that any bot which makes it past the gatekeeper of this site has little chance of doing anything abnormal. A few that I let by, just to see what they were up to, merely posted links to whatever it was they were selling. It might be profitable for them if the links were click bait and they got paid for each visit, but I've not seen any of that here in our forums. It does seem pointless in our case, but we are not the main target. We just happen to be out there with the rest of the targets, some of which might be more productive.

Just about all the random followers I checked into claimed to be females engaged in some erotic activity that I could purchase. That's the same group of bots which try to log into our site and post links of the same sort. Places such as Facebook, Twitter, TikTok, and hundreds more like them are targeted for different reasons. There are bots, such as the ones I mentioned following POTUS on Twitter, which have no interest in selling any products other than propaganda and influence. The content of those bots can be filtered, of course, but most people don't do it. The content is designed to appeal to a specific audience in order to influence behavior. In the case of POTUS, that influence might be to convince the viewer to vote a certain way. It's all psychological but very effective. Obviously political figures would like that kind of support, but those same bots can be used for other types of influence as well, such as selling books an author might have written. You know, get half a million good reviews from your bot followers and a lot of other people will see them too. Some of them will buy your book even though they'd never heard of you previously. It's a great way to sell. But it's not free. The cost of all those followers has to be weighed against the increase in sales.
Kellemora
Posts: 5158
Joined: 16 Feb 2015, 17:54

Re: Software Updates

Post by Kellemora »

I know a lot of authors, and other product sellers, who use auto-posting programs.
Many websites where this happens do check the times these posts are made, and will boot users from the site when they post the same thing each day at the same time, or several times per day.
I was following someone who posted each of their four books every single day, the posts about a half hour apart. It didn't take long to see it was a fixed pattern. I didn't do anything, but they suddenly stopped one day. Then a couple of days later they were on another board complaining about being blocked for 30 days, hi hi.

I have a couple of folks who send me an e-mail every few days, but they go straight to my junk folder.
After about a year of this, I finally changed a few so they go straight to the trash bucket, hi hi.
Then there are some very annoying ones I can't send to junk or I would miss those that are important, because they use the same e-mail account for both their garbage and their important stuff.
forumadmin
Site Admin
Posts: 42
Joined: 14 Feb 2015, 17:39

Re: Software Updates

Post by forumadmin »

I too make good use of the junk mail filters provided by Thunderbird. One of the tricks used by the bots is to send their spam via the contact page of this site. That means all those e-mails originate from the administrator's e-mail account, so it would be a bad idea to filter them wholesale. But, as I said earlier, bots are not all that smart. The text generally contains scripted language that appears in almost every one of their spam messages. Thus I created filters looking for specific phrases in the text. Sometimes filtering on specific links is what's necessary. The bots don't change their content very often, so it becomes easy enough to filter them by phrases in their text.
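That phrase-matching idea can be sketched in a few lines of Python; the phrases shown are examples, not the actual filter list, and Thunderbird's own message filters do essentially the same body-contains match through its GUI:

```python
# Sketch of phrase-based spam filtering as described above.
# The phrases are illustrative examples, not the real filter list.
SPAM_PHRASES = (
    "click here to see message",
    "limited time offer",
)

def is_spam(body: str, phrases=SPAM_PHRASES) -> bool:
    """Flag a message whose body contains any known spam phrase,
    matched case-insensitively."""
    lowered = body.lower()
    return any(phrase in lowered for phrase in phrases)
```

Because the bots rarely rewrite their scripts, a small, stable phrase list like this keeps catching them for months.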

Many forums such as ours have specific rules against commercial use. Others don't mind or don't specify their policy. Twitter, Pinterest, and Tumblr, for example, live off the ads and thus encourage them. The only change being made at Twitter now is that the author of the content can be paid directly. I'm not sure how Twitter gets its cut, but I'm certain they do.
Kellemora
Posts: 5158
Joined: 16 Feb 2015, 17:54

Re: Software Updates

Post by Kellemora »

I would have to go back and check my filter list, but I think I have some lines in it about having to "click here to see message." Actually a few lines similar to that, too.

I would say about 1/3 of my personal book sales come from blurbs I place occasionally on Twitter.
I've never had to pay them one red cent.