Monday, October 14, 2013
This morning my work PC running Windows 8 decided to forcefully tell me, via a full-screen timer, that it would be rebooting in 15 minutes due to updates. At least there was a close option so I could save my work, but I suspect the timer would have kept running regardless.
This is most intrusive and I am highly annoyed by the tactic. I should be allowed to choose when to reboot for updates, not be forced into it. What if this happened during a presentation in front of an audience? It makes me glad I never bothered getting Windows 8 for my home PC, having bought Windows 7 instead. I am not happy with some aspects of Windows 7 either, but that is another matter.
Windows 8 has many other annoyances that I would like to avoid as well, which Windows 7 did not have, but this forced reboot is yet another intrusive tactic. I understand that it is most likely done for security reasons, but if the system had been designed properly and not just as a façade, this tactic would not have been needed.
Wednesday, September 04, 2013
XAMPP Apache HTTP server port 80 blocked on Windows 7
For testing websites on my local PC I use XAMPP, and with it the Apache HTTP server, configured for the standard ports 80 and 443. Recently I had a rather persistent case of the web server simply not starting because something was already using port 80. This was on Windows 7 64-bit.
The hunt to get Apache started on port 80 was a long story, including a weird error in the log about port 443 being used instead of port 80.
At one point the log files did not get populated either. It helped to start Apache using the XAMPP-provided batch files; this actually showed the error, including that the logs could not be written.
All of the usual suspects were checked, and most were either not installed or their services had previously been disabled. This included IIS, SQL Server, web publishing and routing services, etc. The list went on. Eventually I killed most services and processes that could have been even remotely related to this, and still no luck.
In the end, the cause of my problem was this setting, which was still ticked:
Allow Remote Assistance connections to this computer
The next day I checked the same setting on my PC at work, where Apache runs successfully with XAMPP on Windows 8, and found that the checkbox was ticked there too. The funny thing is that I do not remember the actual service actively running. Now I wonder why Remote Assistance was using port 80 in the first place, when Microsoft already has two or three other pieces of software that use this port and it is the standard HTTP port. Going to my localhost address with a web server on the default port 80 was useless. I really wonder why someone implemented it this way.
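When hunting down this kind of conflict, it can help to check programmatically whether a port is already taken before starting Apache. A minimal cross-platform sketch using only Python's standard library (port 80 here is just the example from my case):

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1)
        # connect_ex returns 0 when the connection succeeds,
        # i.e. when a server is already listening on that port.
        return s.connect_ex((host, port)) == 0

if port_in_use(80):
    print("Port 80 is taken; something else is listening on it.")
else:
    print("Port 80 looks free; Apache should be able to bind it.")
```

On Windows, `netstat -ano | findstr :80` then matching the PID column in Task Manager identifies the culprit process.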
Thursday, August 01, 2013
The number of beggars on the streets of Cape Town is growing
This is an ugly topic, but someone has to say something. It also affects tourism, investors and other industries, so something has to be done.
Over the last couple of weeks and even months, I have noticed more and more beggars asking me for money. It is not just me; several other people have told me the same thing:
There are more and more beggars on the streets of Cape Town!
Since I do not have deep pockets and pay a lot of tax, I am asking: what are local and national government doing about this? All I see is that taxpayers and commuters are now taxed at least 30% on petrol, and this is most likely going up. That in turn affects food prices and eventually jobs, which adds to the unemployment problem and creates even more beggars. The rich get richer while the rest struggle and get poorer. To make matters worse, I have heard that electricity prices are now going up yet again.
Usually this kind of thing is followed by yet another squandering of government funds on useless things, for which the struggling taxpayer once again foots the bill.
Obviously it is a 100% guarantee that nobody in government will put even a minimal support system in place for these beggars, because that would be political suicide, a financial black hole and a huge project that nobody wants to touch with a very long stick. In the end it simply stays unmentioned, even though everyone knows about it. Acknowledging nothing would be the standard procedure here when it comes to anything being done.
Add to this mess the issue, seen a few weeks ago, of the toilet system promised for at least 80,000 houses that was either never implemented or implemented in a totally impractical way, using a temporary system meant for a weekend of camping rather than years of daily use. Obviously people have simply had enough of this mess of promises made and totally under-delivered.
Add to this the usual top-level fat cats who get paid bonuses and extra bonuses for more or less doing nothing while supposedly performing well, while the people doing the work slave away for minimal pay in horrible working conditions, understaffed and poorly equipped.
Something needs to be done soon before even more people end up on the street.
Tuesday, April 16, 2013
Different employment criteria
Recently I was shocked, multiple times, to hear how some people I know decide whom to hire. It comes down to what language the person speaks, or requiring so many years of work experience plus diplomas that the studying would have had to start in early high school. Then I heard criteria based on absolutely minor things about why a person should or should not be hired: someone is not considered for a position simply because he or she might come from a small town or went to a certain school. The whole thing was rather shocking, and this is for an organisation of between 20 and 30 people.
From someone who employs over 300 people, I have heard the following advice:
Hire people with the right attitude and do not worry too much about skills. He mentioned that when he hired purely on skills, those people left soon after, and additional time was then needed to hire and train yet more people, which turned out far more costly than training someone with hardly any skills. This follows the motto: skills can be acquired; attitude cannot, and bad habits cannot really be removed once ingrained.
Thursday, March 07, 2013
Why it is important to get your website working well on mobile devices
Being a small business owner, a (mostly technical) website builder, and the owner of an entry-level smartphone and a tablet PC has taught me important things about building a company website:
- Make sure that the core functionality of your website works without JavaScript enabled. The reasons for this are as follows:
- A lot of people where I live own basic smartphones which often have JavaScript disabled by default. BlackBerry phones in particular are like this. Most of these people do not know about JavaScript or how to enable it.
- Limited memory capacity: when I still owned a BlackBerry phone, I had a case where a website would not open because it was so heavy that the phone ran out of memory. The site had lots of scripts and other busy stuff.
- Device performance suffers when scripts take up too much processing time, to the point that the device responds slowly.
- Load JavaScript libraries late so that the page starts rendering early. Scripts can usually wait a second or two. Some scripts also take a long time to initialise while the user is already trying to do something that relies on them being ready. Best of all is not to require them in the first place.
- Allow the interface to change and re-arrange based on screen size. This also means hiding huge banners if a small screen size is detected. The min-width and max-width CSS media queries help.
Example of a media query wrapper in CSS:
@media only screen and (min-width: 650px) {}
- This one is very important and applies to all devices: compress, minify, cache, etc.
- Let the browser cache images for months, since they normally do not change once created. The same applies to JavaScript libraries, CSS (once finalised) and other static files; but be careful: content that changes should not be cached.
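As a sketch of how this caching could be set up under XAMPP's Apache, assuming mod_expires is enabled (the exact lifetimes are just example values, not a recommendation for every site):

```apache
# .htaccess sketch, assuming Apache's mod_expires is available
<IfModule mod_expires.c>
  ExpiresActive On
  # Images rarely change once created: cache for months
  ExpiresByType image/png  "access plus 6 months"
  ExpiresByType image/jpeg "access plus 6 months"
  # Finalised CSS and JavaScript libraries
  ExpiresByType text/css               "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
  # HTML content changes, so do not let it linger in the cache
  ExpiresByType text/html "access plus 0 seconds"
</IfModule>
```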
- Minified CSS, JavaScript and HTML is always good for keeping download times low. The browser also has less parsing to do if there are no extra spaces. Also consider that people with entry-level mobile phones often cannot afford much bandwidth, and you do not want to lose customers because of slow downloads on a simple device.
- Get rid of any extra image data such as comments, camera details and EXIF data. This information is useless unless you have specific image gallery sections that use it. For interface images you sometimes get lucky and strip between 15% and 30% off the file size. Smaller files download faster and are likely to stay in the device cache longer, since more free space means more files can be kept before being flushed by age.
- Make sure the fonts are big enough for mobile devices and the interface looks good at a low screen resolution. Many websites look good when designed at a high resolution, but the ugly parts show up when switching to a lower one; the other way around is much easier. This also means your button text should not be too long or too short. These interface limitations are very likely to cause clutter and confusion, so you may have to break your site into smaller components and check each one individually. These components should be able to flow below each other, or adapt in some other way, on smaller screen resolutions without the user having to zoom in five times; preferably no zooming at all, because zooming makes people give up quickly.
- Keep it simple: for search engines your pages should be specific, and the same applies to the average user browsing your website. Too many calls to action (CTAs) are distracting and overkill for low-resolution interfaces. Enquiry and registration forms should stay simple, since anything that looks remotely complicated gives users second thoughts.
I can recommend a book on web usability that I have read in this regard: Don't Make Me Think.
Thursday, February 28, 2013
Another bad behaving IP address to block
Yet again there is bad bot behaviour on my website. This time it comes from the IP address 14.18.25.69.
It appears to be a bot, because it tries to use the external Google Tag Manager URL as a URL on my website. The 360Spider already got an HTTP 418 response code today for its nonsense.
Based on my quick research, the traffic comes from China. See:
http://myip.ms/info/whois/14.18.25.69
http://myip.ms/info/whois/118.85.207.18/k/3309037677/website/center.189.cn
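As an illustration, blocking such an address in Apache could look like the following .htaccess sketch (this uses the Apache 2.4 `Require` syntax; older 2.2 setups would use `Order`/`Deny from` instead):

```apache
# .htaccess sketch: refuse requests from the offending address
<RequireAll>
  Require all granted
  Require not ip 14.18.25.69
</RequireAll>
```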
Tuesday, February 26, 2013
Very suspect website traffic
Adding a mailing script and some logging to the 404 page of my website revealed the scary world of security hacking attempts, spammers, unwanted bots, old links and, hopefully, no missing files.
The Baidu spider started generating traffic to URLs that do not even remotely exist on my site, in particular sign-up, join and membership pages. This was later followed by some three-level-deep news URLs, while my website is only one level deep. Other bots became an issue as well, but they did not keep going as long as Baidu did. At least Baidu is a big search engine, but the traffic makes me think someone was using the search engines to scan my site for sign-up forms, for spamming and other purposes. Since Baidu did not know of the URLs, it was most likely sending the bot to check whether they were valid. If that is the case, then I am glad I was made aware of these attempts. They all got 301 redirects for their effort.
Another series of visits was a blatant attempt to breach security by probing various standard admin URLs of WordPress and other CMS systems, trying to identify what I use for my website and possibly break in. Thanks to a custom system and security through obscurity, all of those attempts failed horribly, and the 301 redirect list got extended. The CMS login now also requires three passwords and allows only three login attempts.
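As a minimal sketch of how such 404 requests could be flagged for logging, here is one way to match request paths against known probing patterns. The pattern list is purely illustrative, not my actual redirect list:

```python
import re

# Example patterns (hypothetical) that commonly show up in
# CMS-probing and sign-up-scanning requests.
SUSPICIOUS = [
    r"wp-login\.php", r"wp-admin", r"administrator/",
    r"sign[_-]?up", r"register", r"join", r"member",
]
SUSPICIOUS_RE = re.compile("|".join(SUSPICIOUS), re.IGNORECASE)

def is_suspicious(path):
    """Flag a requested URL path that matches a known probing pattern."""
    return bool(SUSPICIOUS_RE.search(path))

for path in ["/wp-login.php", "/news/sign_up.html", "/contact.html"]:
    print(path, "->", "suspicious" if is_suspicious(path) else "ok")
```

A 404 handler could call such a check and decide whether to just log the hit or also send a notification mail.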
Another recent case was URLs from other sites where the "http://" prefix gets stripped out and the rest gets appended to my domain, maybe to get lucky. My entire website uses full URLs, and this was already happening when I still used relative URLs.
The user agent string recorded for this was:
Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; MDDR; .NET4.0C; .NET4.0E; .NET CLR 1.1.4322; Tablet PC 2.0); 360Spider
IP Address: 101.226.166.243
HTTP Accept Language: zh-CN
Wednesday, February 20, 2013
Activate-A Mercedes-Benz tablet device app
Mercedes-Benz in South Africa has launched an app that works on iOS and Android tablet devices. It does not seem to be available for entry-level Android phones, since I tried to get it. Part of the app requires checking in at certain locations. The app is supposed to show how the Mercedes-Benz A-Class will look at certain locations, such as a road or a parking lot, using augmented reality, to let you experience that type of lifestyle. I could not test this part yet, since I first have to get home to install the app and then drive to these locations.
To install the app, search for "activate-a" (without the quotes) in the app store. The app does require registration and works for Cape Town, Durban and Johannesburg areas.
Saturday, January 26, 2013
Really weird website traffic from Baiduspider
In the past couple of days, my website got some really weird traffic from the Baiduspider.
This was all for pages that my website does not have, which even included Yahoo news articles. My website currently only has a single news page, not a three-level-deep structure.
Even odder was the probing for sign_up, signup, link-resources, register, join, member, membership and other files and folders like that. Not a single one of these requests was for an actual page on my website, since I had everything else renamed for logging reasons.
The user agent of one of these page not found cases is:
Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)