# Google Sabotaging Firefox



## Phishfry (Apr 19, 2019)

Former Mozilla exec: Google has sabotaged Firefox for years

> Former and current Mozilla engineers are reaching their boiling points.

(www.zdnet.com)

Protect your account with JavaScript?

Google won't let you sign in if you disabled JavaScript in your browser

> Google announces four new security features to protect Google accounts.

(www.zdnet.com)


----------



## obsigna (Apr 19, 2019)

Too many allegations and too little proof in that article. Not one single real-world example. People turned their backs on Firefox because it had become a slow starter; it took 2 minutes before I saw the first window. And who believes that the Quantum update was a whole overhaul also believes in the Easter Bunny. On my systems the incremental update to Quantum was less than 10 MB. They disabled some of the most time-consuming telemetry stuff - that's my allegation. However, Firefox becomes a quick starter - 2 seconds to the first window - if I switch off all this idiotic telemetry, and that is proven. I don't use Chrome, though, for other reasons. I have come to like Epiphany very much - very quick, perfect page rendering, perfect page introspection, and not bloated.

Regarding JavaScript, I do not share the aversion of many. JavaScript is a tool which does its job like a hammer. I would never abandon all the hammers in my house just because hammers could be misused by others to hurt somebody.


----------



## Deleted member 30996 (Apr 19, 2019)

obsigna said:


> JavaScript is a tool which does its job like a hammer. I would never abandon all the hammers in my house just because hammers could be misused by others to hurt somebody.



I would compare scripts to good nails and bad nails. 

Disable the hammer from pounding each nail by default till it can be determined which of the good nails are actually needed, then pound only those.


----------



## ralphbsz (Apr 19, 2019)

Rule #1: Don't ascribe to malice what can be explained by incompetence.  Another rule is: Most conspiracies don't actually exist.  If you have to imagine a conspiracy to explain something, you are probably wrong.

And as far as JavaScript goes: it is, for better or worse, simply necessary today.  Static HTML web pages are simply not feature-rich enough for what most people try to do with the web.  Nearly everything we do on the web that is more complicated than looking at static text and static pictures requires a client-side programming language.  People who implement web-based stuff need to rely on that being present.

It would be nice if JavaScript were completely standardized (not just the core language, but the client-side functions), and all browsers and web servers could be tested for compliance with a standard.  But that's not the world we live in, at least not yet.  About 10 years ago I tried to learn AJAX (using asynchronous JavaScript in a web browser to communicate with servers, for example to implement database query or remote editing tools), and at that time 90% of the effort had to go into hacking to be compatible with the half dozen incompatible implementations out there.  It was a mess, and I gave up.  Rumor has it that today the situation is better, but by no means good enough; I have stopped trying to implement browser-based stuff.  It would be ideal if the combination of HTTP, HTML, XML, JavaScript and the JS client libraries were standardized by a coherent standards body.  But that's simply not the world we live in, with standardization fragmented and sometimes dysfunctional.  So implementors have to test all combinations, and due to lack of manpower, things get dropped on the floor.
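For flavor, the compatibility hacking of that AJAX era usually started with something like the following cross-browser request factory (a sketch from memory; `ActiveXObject` was IE6's vendor-specific API, and `"Microsoft.XMLHTTP"` is just one of several ProgIDs that were probed in practice):

```javascript
// Era-typical feature detection: try the standard API first,
// then fall back to IE6's proprietary ActiveX object.
function createXHR() {
  if (typeof XMLHttpRequest !== "undefined") {
    // Standards path: Firefox, Safari, Opera, IE7+
    return new XMLHttpRequest();
  }
  if (typeof ActiveXObject !== "undefined") {
    // IE6 fallback
    return new ActiveXObject("Microsoft.XMLHTTP");
  }
  return null; // no AJAX support at all
}

const xhr = createXHR();
console.log(xhr === null ? "no XHR implementation found" : "got an XHR object");
```

Libraries of the time (Prototype, early jQuery) shipped variants of this probe; that it was needed at all is the point.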


----------



## Spartrekus (Apr 19, 2019)

> Chris Peterson @cpeterso
>
> YouTube page load is 5x slower in Firefox and Edge than in Chrome because YouTube's Polymer redesign relies on the deprecated Shadow DOM v0 API only implemented in Chrome. You can restore YouTube's faster pre-Polymer design with this Firefox extension: https://addons.mozilla.org/firefox/addon/youtube-classic
>
> https://twitter.com/i/web/status/1021626510296285185



They probably never heard of Unix, BSD, and a console with links, w3m, lynx,...


----------



## Deleted member 30996 (Apr 19, 2019)

ralphbsz said:


> Rule #1: Don't ascribe to malice what can be explained by incompetence.  Another rule is: Most conspiracies don't actually exist.  If you have to imagine a conspiracy to explain something, you are probably wrong.



I would add greed in there.

One of what used to be my favorite sites is now nothing more than a source of income for someone more interested in revenue from script-driven ads than in the payload, or the quality of the nail. Several members posted about landing on the site and getting Google warnings about malicious downloads and drive-by downloads infecting their computers. They hammered every nail they encountered no matter where they went.

It only took one script to browse the forums with full functionality and one additional script for what, to me, amounted to full functionality for the site. Two good nails out of the box; the rest were covered in rust. I never saw the warning pages or was the least bit worried about tetanus.

Scripts that facilitate Bitcoin mining, Meltdown, Spectre etc. are all what I would consider bad nails. NoScript is my hammer of choice.


----------



## kpedersen (Apr 19, 2019)

The problem with all "standard" browsers is that they are consumer-centric.
This means that, as with all consumer software, you get bloat and crapware.

Just look at Nero CD burner versions < 6.x vs the latest version; as soon as it becomes popular, it becomes horrific to use.

That's the world we live in. Just reduce your dependence on the web browser. Use a proper email client, use IRC, etc. Online shopping and the rest should be so quick that the terrible consumer software shouldn't really affect you; the browser is closed down before you know it and you get on with your life.

There is more to computers than the internet. I tend to prefer exploring that stuff instead.


----------



## Spartrekus (Apr 19, 2019)

kpedersen said:


> The problem with all "standard" browsers is that they are consumer-centric.
> This means that, as with all consumer software, you get bloat and crapware.
> 
> Just look at Nero CD burner versions < 6.x vs the latest version; as soon as it becomes popular, it becomes horrific to use.
> ...



The problem is rather that all software (Mac, Windows, .... Android) is built around business and is of low quality.
It works against performance and quality.

Even Linux fans are getting locked into their web browsers (file transfer, WebDAV,... web uploads,....). 

The less people know about sockets, low-level programming, ... the better for business.


----------



## Vull (Apr 19, 2019)

My pet project for the past 10 years has been writing a web application system authoring toolkit (wasat) using PHP, HTML, CSS, SQL, and ECMAscript (javascript), primarily, to write cross-platform compatible business applications in which software and services can be used by FreeBSD, GNU/Linux, Mac OS, and Microsoft Windows clients, and can also be served by all the above platforms, excluding Windows. At first I tried to support multiple popular browsers for the client end of things as well, including Chrome, Firefox, Internet Explorer, Safari, and Opera. At first I would write software to run on either Firefox or Safari as the model browser, and then test and debug the same software until it would also work as well on the other 4 browsers. This final phase of testing and debugging quickly became my biggest problem area, particularly regarding deployment on Internet Explorer (IE), and I soon realized that I was spending more time trying to support all these less-than-fully-compatible browsers than I was spending on writing the software itself, so, after a month or three, I eventually abandoned support of all browsers other than Firefox. I continue to support Firefox only.

Mainly out of curiosity, and perhaps, arguably, some small degree of masochism, I recently tried installing and using Chromium on FreeBSD, strictly as an end-user, and with no great ambition towards supporting it as part of the wasat project, but quickly abandoned that effort too. It was problematic at best, and if Google doesn't care enough about FreeBSD to assist in porting their version of that browser to the FreeBSD platform, then I see no reason why I should care very much about using their browser either.

As a programmer I care as much about stability over extended periods of time as I care about cross-platform compatibility. Long before starting the wasat project, it was apparent to me that Microsoft was a destabilizing influence against software development. Since that time I've come to view Google as an impediment to software stability as well.

The last time (last October) that I tested my software on a Windows 10 client, I noticed that Firefox was _still_ a slow starter on that platform, just as it was when I tested it on Windows XP 10 years ago. I haven't noticed the same problem on other platforms, only on Windows. Firefox starts in an acceptable amount of time on all the other systems I've tried it on.

Using ECMAScript, and particularly the XMLHttpRequest API, is essential to writing smoothly-running and network-efficient web applications, in my opinion and in my experience with the wasat project. I don't know of any way to implement drag-and-drop or dynamically-changing drop-down list features without it. When not being used, my ECMAScript objects are quiet and/or entirely absent, and consume no resources either in the client browser or in the server. The resource-saving features of ECMAScript far outweigh whatever resource usage they entail. Contrarily, the ZDNet link in the opening post of this thread uses ECMAScript extensively to drive and animate advertisements, so I can easily see why ECMAScript is sometimes viewed negatively, but, like any good tool, it's all about how it's used, and why it's used, so I'm not going to remove ECMAScript from my toolkit.

_Buggy_ ECMAScripts are a nightmare, and can be as bad as, or worse than, resource-greedy, advert-driving ECMAScripts, but none of this is the fault of ECMAScript per se.


----------



## Deleted member 30996 (Apr 19, 2019)

kpedersen said:


> There is more to computers than the internet. I tend to prefer exploring that stuff instead.



I frequent 3 forums where I only play alliterative word games, plus this and 1 other tech-related forum and 1 A.I. forum. I rarely use email and no form of Instant Messaging.

I do, however, love shopping on eBay, but I have enough stuff already.

The majority of my time online is spent working on my chatbot Demonica. If I can get motivated today, by Monday I will rise in rank to #4 out of 142,000 botmasters registered there.

That's about the extent of my online activity, at $59 a month no less.


----------



## Phishfry (Apr 20, 2019)

obsigna said:


> And who believes that the Quantum update was a whole overhaul also believes in the Easter Bunny.


Good arguments. I am not saying Mozilla is going in the right direction as far as privacy or speed.
I tend to believe conspiracy theories because I live in the real world. It is a very different place from what is in writing.
We have reams of paper at work telling us how safe we are. In real life we have had 2 work-related deaths in 15 years.

That the Chrome team is undermining Mozilla is a given to me. I just like to see it in writing.
To see it from an active Moz developer would mean more.
I live in the Oops world.
Things get classified as accidents which were actually sabotage. You see, it's very hard to prove sabotage.
To bring those charges you have to be 100% correct. Nobody wants that hassle. We let it slide.


----------



## obsigna (Apr 20, 2019)

Although my first name is not Thomas, I doubt everything unless I have seen it myself. And the most suspicious news for me is the kind that wants me to believe something which everybody likes to believe. The ex-Mozilla executive could have simply said that the popular Google page X had some generally incompatible HTML, CSS or JS features activated which broke the thingy Y and the accelerator Z of FF and was not easy to work around, couldn't he?


----------



## Phishfry (Apr 20, 2019)

Maybe the ex-dev is jealous because Chrome is now in Windows. Clickbait for ZDNet.
I agree some shred of evidence would be helpful.
My retort would be that dirty tricks don't often become public knowledge.


----------



## obsigna (Apr 20, 2019)

Vull said:


> ... and I soon realized that I was spending more time trying to support all these less-than-fully-compatible browsers than I was spending on writing the software itself, so, after a month or three, I eventually abandoned support of all browsers other than Firefox. I continue to support Firefox only. ...



I develop everything on the Mac against Safari, and before deployment I give it a test with Firefox on Windows. The point is that almost all modern popular browsers are WebKit based and claim in the User-Agent header to be Safari compatible. The only popular exception is desktop Firefox. So in practice I develop for WebKit, and I cannot remember a single occasion of JS incompatibility with Firefox.
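As a quick illustration of how broad that claim is: a naive User-Agent check for "Safari" matches Chrome as well, because Blink kept the Safari token. The two sample strings below are representative UA strings of the period, and the check is deliberately naive:

```javascript
// Representative User-Agent strings (Chrome 73 on Windows, Firefox 66 on FreeBSD).
const chromeUA  = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36";
const firefoxUA = "Mozilla/5.0 (X11; FreeBSD amd64; rv:66.0) Gecko/20100101 Firefox/66.0";

// Naive sniff: does the UA carry a Safari token at all?
const claimsSafari = (ua) => /Safari\//.test(ua);

console.log(claimsSafari(chromeUA));  // true - Chrome carries "Safari/537.36"
console.log(claimsSafari(firefoxUA)); // false - desktop Firefox is the odd one out
```

Which is exactly why developing "for WebKit" covers so much of the market while Firefox stays the one browser worth testing separately.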

The problematic area for me is mostly related to the different design of the UI elements. So in general, I create a basic design in CSS for buttons, text fields, text areas and drop-down menus which in theory should look the same in all browsers. Only recently I saw that Epiphany (which tells in the UA that it is Safari) draws a slightly ugly frame around SELECT elements where others did not, and I used a JS snippet to help Epiphany forget the frame.


----------



## drhowarddrfine (Apr 20, 2019)

obsigna said:


> almost all modern popular browsers are WebKit based


Chrome and Opera are based on Blink, which is a fork of WebKit, but the two camps have since diverged. 


obsigna said:


> which in theory would look the same in all browsers.


Much discussion is now about whether the UI should resemble the UI of the device and not look the same in all browsers but now I'm getting OT.

I did not read the article, but my thought is that what Google has done is unintentional and caused by the bubble they live in. When Chrome first came out, Google stated they created their own browser not to compete with the others, but because, back then, browsers were too slow and inconsistent. Firefox stole about 30% of market share from IE, but that's as far as they could get on their own. Chrome essentially put the boot on Microsoft's throat, and that was a good thing.

Google today still states that everything they do is to improve browsers, networking and computing and, it is true, many things are much better now due to their pushing these advancements. However, some of the things they do step on toes and push others aside. They take a "lead, follow or get out of the way" attitude, and that can ruffle feathers. Worse, they seem oblivious to this fact and forge ahead with things others aren't sure they want. They may want them but feel they're forced upon them--such as AMP.

AMP in some instances appears to speed up web pages, but it also wrests control away from the creators. It's non-standard to an extent and controlled by Google. The advantages AMP gives may be great for some, but it gives me the impression they ignore what the rest of us think or want. It _might_ not be malicious, as Microsoft so often was, and just part of a long-range, glorious plan to make the internet a wonderful place, but we aren't aware of that plan or where it's heading, and, in my mind, it makes me wonder how much longer I'll be needed on the technical side of improving the web before they kick me to the curb.

So Google will say, "This is great! We're implementing it and so should you," and then we find out we _have to_ or we'll get hurt in search results. Otoh, Google is often right. If we implement their methods, our sites wind up being better in some way, but it's as if we're being forced.

I'm really not wanting to invoke the internet mantra of "Evil". I just feel they have a well-thought-out plan but are running roughshod over everyone, and I don't like that either.


----------



## obsigna (Apr 20, 2019)

drhowarddrfine said:


> Chrome and Opera are based on Blink which is a fork of webkit but the two camps are different now. ...


As long as they’re claiming to be Safari as well in their User-Agent header, I do not need to care, do I?



drhowarddrfine said:


> Much discussion is now about whether the UI should resemble the UI of the device and not look the same in all browsers but now I'm getting OT.



My take on this is that I go with customized UI elements for web applications with a definite target audience (for example, my device control front ends), and simply stay with the elements provided by the device for public web pages with an unknown audience.

For example, I ship my electrochemical measurement devices (potentiostats) with the latest FreeBSD Release + GNOME 3 + Epiphany. The UI is provided by a web frontend, and the measurement software is linked to a web backend. The system can either be used headless and be controlled by any web client over the net, or simply controlled locally (localhost) by the Epiphany browser. Here the point is that in either case the UI looks exactly the same (see screenshot). If you don’t take out a magnifying glass, you won’t notice a difference between Epiphany on FreeBSD and Safari on the Mac. All that said, nothing of this would be remotely possible without JavaScript - which was sort of the initial question of why we want JavaScript.






When it comes to cross-platform development of native applications, I try hard to use the platform’s look and feel for the elements, here for example software for evaluation of the measurement curves, running on macOS or Windows 10. One notable difference is the position of the main menus.

Note, I do not claim that this is the only way of doing things, only this is the way I do it.


----------



## drhowarddrfine (Apr 20, 2019)

obsigna They aren't claiming to be Safari. They're pretending. This has to do with applications, software and web sites which test to see which user agent/browser you are using and deliver pages based on that. Some sites will not serve anything at all ("best viewed with ...."). Removing that will actually break some things. This is why it's always better to target standards and not browsers or devices, but, of course, one needs to test in all of them if one wants to be 100% sure. At my place, we liked the Chrome/Chromium dev tools, so things got tested there first, but sometimes we'd be sitting in Firefox and I'd use the Firefox tools; eventually, we'd look in both. Then Safari, because we'd have to bump the graphics guy off his machine for that. Lastly, we'd dust off the Windows machine in the corner, clean off all the spit, and test in IE and Edge. We kept that machine in a separate room that had padding to keep the screaming noise down.

I'll look at the rest of your post later. Your software looks like cool stuff. I'm cooking and the Blues hockey team is in the playoffs!


----------



## ralphbsz (Apr 20, 2019)

drhowarddrfine said:


> This is why it's always better to target standards and not browsers or devices but, ...


... but there are no workable standards for the browser ecosystem.  Really, in practice there aren't: while HTML5 is standardized to some extent by the W3 consortium, and HTTP is standardized by the IETF as an internet RFC, and basic Javascript is standardized by ECMA, in reality the ecosystem is a gigantic mess of incompatibility (at least, it's better today than 5 or 10 years ago, when it was even worse).  That's why it is easiest to just assume one or a few browsers.  I have only worked for very large computer companies in the last ~20 years, and they all say that internal applications are only tested on one or at most two browsers (typically Firefox and Chrome, or Safari and Chrome).  My wife works for a small company, and her (rudimentary) IT and HR departments simply say: "We know the stuff works in Windows 10 with Edge, and we don't have time/money to make it work for anything else".



> ... and the Blues hockey team is in the playoffs!


Strangely our hockey team hasn't been eliminated yet.  Old joke: What do you do if you are swimming in the ocean and get attacked by a shark?  You just hand them a hockey stick, because that makes a shark completely helpless.


----------



## drhowarddrfine (Apr 20, 2019)

ralphbsz said:


> while HTML5 is standardized to some extent by the W3 consortium


Actually, it's the WHATWG that controls that now.


ralphbsz said:


> it is easiest to just assume one or a few browsers


Nothing is added to the standard until there are at least two implementations by current browsers.


ralphbsz said:


> IT and HR departments simply say: "We know the stuff works in Windows 10 with Edge, and we don't have time/money to make it work for anything else".


And now Microsoft has dumped Edge just like they dumped IE but Firefox and Chrome and the others live on. That's why one should never target browsers.

A couple of months ago, I said the Blues would get to the Stanley Cup Finals. I don't know if they will win but they'll get there. I was concerned after they lost two games in a row but we'll see how things go tonight.

My prediction is based on my perception that finishing in first place no longer means anything. Just get into the playoffs and push hard from there. The teams that finish first are worn out by the end, but the others took it a little easier and practiced their craft. The 2011 Cardinals backed into the playoffs even though they had to win the last game of the season. Atlanta just had to win one of their last two games but lost both of them. Then the Cards went on to win the World Series as a wildcard team.


----------



## Vull (Apr 20, 2019)

ralphbsz said:


> ... but there are no workable standards for the browser ecosystem.  Really, in practice there aren't: while HTML5 is standardized to some extent by the W3 consortium, and HTTP is standardized by the IETF as an internet RFC, and basic Javascript is standardized by ECMA, in reality the ecosystem is a gigantic mess of incompatibility (at least, it's better today than 5 or 10 years ago, when it was even worse).  That's why it is easiest to just assume one or a few browsers.  I have only worked for very large computer companies in the last ~20 years, and they all say that internal applications are only tested on one or at most two browsers (typically Firefox and Chrome, or Safari and Chrome).  My wife works for a small company, and her (rudimentary) IT and HR departments simply say: "We know the stuff works in Windows 10 with Edge, and we don't have time/money to make it work for anything else".
> 
> ...


I can readily sympathize with the small company's staff. Although I no longer do any kind of strict testing and debugging on them, in general my pages still display okay on Chrome and Safari. IE was always klunky, but probably still works for the most part, and with Edge, everything seems okay at first, but then Edge seems to get into some unresolved timing issues with warp-scrolling of database displays-- problems which no other browser has and which I have no plans, time, or any particular need to resolve. My stuff fortunately isn't intended for the mass public who use Edge, although Windows 10 itself is no problem at all for a client system. People can always install Firefox and still keep their favorite browsers on their desktops if they wish, no problem. I decline to care about what happens with the Microsoft or Opera browsers; useragent identifiers can easily be faked, and there seems to be little or no standardization in the default html formatting values for any of the almost-too-numerous-to-count default formatting settings and dom objects like the always-pesky file-select or "browse" buttons.

Using PHP I generate every page as HTML5, using strict XHTML syntax, so that every page can be automatically syntax-checked with an XML parser before it's served. The ECMAScript is all objectified and carries the "use strict" directive.


----------



## drhowarddrfine (Apr 21, 2019)

Vull said:


> Using PHP I generate every page as html5, using strict xhtml syntax


How are you serving your pages? As HTML or XHTML? iow, as text/html or application/xhtml+xml?


----------



## Vull (Apr 21, 2019)

drhowarddrfine said:


> How are you serving your pages? As HTML or XHTML? iow, as text/html or application/xhtml+xml?




```
<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="application/xhtml+xml; charset=UTF-8" />
<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE8" />
<meta http-equiv="Pragma" content="no-cache" />
<meta http-equiv="cache-control" content="no-cache, no-store, must-revalidate" />
<meta http-equiv="Expires" content="-1" />
...
```
Every page starts like this. XML syntax checking is done on the fly using `xml_parser_create_ns('UTF-8')` and `xml_parse()`. Syntax checking is the reason for using XHTML.
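The "tags paired up right" part of that check can be sketched independently of PHP; the toy JavaScript below (an illustration only, not the `xml_parse()` pipeline described above) verifies open/close nesting and deliberately ignores comments, CDATA, and attributes that contain `>`:

```javascript
// Toy well-formedness check: verify that non-void tags nest properly.
// XML-style self-closing tags ("<meta ... />") are skipped entirely.
function tagsBalanced(html) {
  const tagRe = /<(\/?)([a-zA-Z][a-zA-Z0-9]*)[^>]*?(\/?)>/g;
  const stack = [];
  let m;
  while ((m = tagRe.exec(html)) !== null) {
    const [, closing, name, selfClosing] = m;
    if (selfClosing) continue;                              // <meta ... /> pairs with itself
    if (!closing) stack.push(name.toLowerCase());           // opening tag
    else if (stack.pop() !== name.toLowerCase()) return false; // mismatched close
  }
  return stack.length === 0; // anything left open is an error
}

console.log(tagsBalanced("<html><head><title>ok</title></head></html>")); // true
console.log(tagsBalanced("<div><p>oops</div></p>"));                      // false
```

A real parser handles all the edge cases this toy skips, which is exactly why delegating to one, as described above, is the sane choice.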


----------



## drhowarddrfine (Apr 21, 2019)

Vull You need to make sure you are actually serving the Content-Type that way by checking in a browser's dev tools. Using the meta tag alone does not guarantee the server will serve it that way (I just don't recall, because we always set ours in the server itself).


----------



## Vull (Apr 21, 2019)

Thanks drhowarddrfine. How do you go about setting that up in the server? I'm using apache24. As I said above, I do this mainly just to assist in debugging my HTML syntax-- making sure I have my tags paired up right, and that sort of thing. In that regard, the PHP xml_parse() function seems to work well and consistently, but there may well be other benefits or side effects of using XHTML I simply don't know about.
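Not having apache24 in front of me, take this as a hedged sketch of the two common Apache routes rather than a tested recipe (the `.xhtml` extension and the `FilesMatch` pattern are illustrative assumptions; since these pages are emitted by PHP, a `header('Content-Type: application/xhtml+xml; charset=UTF-8');` call at the top of the script is the other usual route; either way, verify in the browser's dev tools that the served header actually changed):

```apache
# httpd.conf or .htaccess (mod_mime): map an extension to the XHTML media type
AddType application/xhtml+xml .xhtml

# or force the type for files matching a pattern (pattern is an example)
<FilesMatch "\.xhtml$">
    ForceType application/xhtml+xml
</FilesMatch>
```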

EDIT: Here's a screenshot to show how I use XHTML for syntax debugging:


----------



## drhowarddrfine (Apr 21, 2019)

Vull I have not used Apache in a long time so I don't recall the proper way to set that.

The best way to test your HTML/XHTML is using the W3C validator.

You can see what your server is actually delivering by using the dev tools in Firefox. Press Ctrl-Shift-C and click on Network. That should display the files being served. Click on the HTML one. Click on Headers on the right side and look for "content-type". If you are serving HTML, it will be listed as "text/html". If it truly is XHTML, it should say "application/xhtml+xml".


----------



## Vull (Apr 21, 2019)




drhowarddrfine said:


> Vull I have not used Apache in a long time so I don't recall the proper way to set that.
> 
> The best way to test your HTML/XHTML is using the W3C validator.
> 
> You can see what your server is actually delivering by using the dev tools in Firefox. Enter ctl-shift-C and click on  Network. That should display the files being served. Click on the html one. Click on Headers on the right side and look for "content-type". If you are serving HTML, then it will list it as "text/html". If it truly is XHTML, it should say "application/xhtml+xml".


I'm serving "text/html". Switching back to XHTML at this time would be problematic at best. I dug through my notes; I switched to HTML5 several years ago, but formerly I used the following, and it passed the W3C validator for XHTML:
```
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN" "http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
```
Then I switched to HTML5, which is incompatible with XHTML, but by following something similar to these w3.org compatibility guidelines I was able to continue using a relaxed XML-style syntax with xml_parse() to check each page for syntax errors, which is all I ever really cared about. I also need to serve non-XML streams like CSS stylesheets, JS ECMAScripts, and plain text via XMLHttpRequest, and since I've already parsed my XML syntax on the server, I don't really need the browser to recheck it. Maybe I'm missing something (?), but I don't see any harm in letting the browser use its text/html parser on pre-validated XML text.


----------



## obsigna (Apr 21, 2019)

drhowarddrfine said:


> obsigna They aren't claiming to be Safari. They're pretending.



Well, yes, I am not a native English speaker, and I agree "to pretend" is actually the better verb. This is like some of those Korean cars pretending to be a Mercedes with respect to look and feel (perhaps feel and first look); anyway, the important UI facilities are all in place where they belong, like the steering wheel, the accelerator/brake pedals and the gear change.



drhowarddrfine said:


> This has to do with applications, software and web sites which test to see which user agent/browser you are using and deliver pages based on that. Some sites will not serve anything at all ("best viewed with ...."). Removing that will actually break some things.



Of course, that’s the reason, and like the Korean Mercedes-alikes, these Safari-alikes come fully equipped; otherwise they wouldn’t be alike and the end user would complain or choose the real thing one day. This is not my fight and I don’t care - should I?



drhowarddrfine said:


> This is why it's always better to target standards and not browsers or devices but, of course, one needs to test in all of them if one wants to be 100% sure. At my place, we liked Chrome/Chromium dev tools so it got tested there first but sometimes we'd be sitting in Firefox and I'd use Firefox tools but, eventually, we'd look in both. Then Safari cause we'd have to bump the graphics guy off his machine for that. Lastly, we'd dust off the Windows machine in the corner and clean off all the spit on it and test in IE and Edge. We kept that machine in a separate room that had padding to keep the screaming noise down.



This is similar to my development procedure. Only, I develop on and test against a different set of browsers, and with respect to the standards, I tend to stay on the narrow road. IE and Edge are not on the list, but occasionally I opened the one or the other of my sites with these as well, and it simply worked. Perhaps the pages render less beautifully; however, the whole of Windows is ugly, and whoever accepted no-kerning Arial as the Helvetica-alike for decades won’t see any difference anyway.



drhowarddrfine said:


> I'll look at the rest of your post later. Your software looks like cool stuff.



Thank you!



drhowarddrfine said:


> I'm cooking and the Blues hockey team is in the playoffs!



Good luck.


----------



## drhowarddrfine (Apr 21, 2019)

Vull said:


> Then I switched to html5 which is incompatible with xhtml


It's not. You may find this and this interesting reading.


----------



## Spartrekus (Apr 21, 2019)

Web browser war... it seems that the winner will be the one that manages to impose its power on the web market.


----------



## Vull (Apr 21, 2019)

drhowarddrfine said:


> It's not. You may find this and this interesting reading.


After skimming through those two links, my mind is mush. I feel like I've been tag-team slapped-silly by Moe Microsoft and Curly Chrome. It reminds me well of why I dropped support for Microsoft browsers in the first place.

Clearly I was in error by claiming to still be using XHTML, so I'll just stop that right now. I must be getting senile. I'm also dropping the `<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE8" />` tag, which is long past due, leaving me with an even simpler prolog:
```
<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<meta http-equiv="Pragma" content="no-cache" />
<meta http-equiv="cache-control" content="no-cache, no-store, must-revalidate" />
<meta http-equiv="Expires" content="-1" />
```
Thanks again.



Spartrekus said:


> Web browser war... it seems that the winner will be the one that will manage to impose on the web market its power.


Sorry for going off-topic. 
One Browser to rule them all, One Browser to find them,
One Browser to bring them all and in the darkness bind them
In the Land of Mordor where the Shadows lie.​
Additional apologies to J.R.R. Tolkien.


----------



## Spartrekus (Apr 22, 2019)

Vull said:


> After skimming through those two links, my mind is mush. I feel like I've been tag-team slapped-silly by Moe Microsoft and Curly Chrome. It reminds me well of why I dropped support for Microsoft browsers in the first place.
> 
> Clearly I was in error by claiming to still be using XHTML, so I'll just stop that right now. I must be getting senile. I'm also dropping the `<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE8" />` tag, which is long past due, leaving me with an even simpler prolog:
> 
> ...



Good imagination...

A testament for the web and low-spec machines: Windows or Android will tell you, "Dude, your hardware is too old for the web. Upgrade, dude. Buy a better one for Bill." Bill will be richer if you buy newer hardware. Thanks again, Bill, for supporting Windows.

... and sorrow on Mordor.


----------



## Vull (Apr 25, 2019)

drhowarddrfine said:


> Vull I have not used Apache in a long time so I don't recall the proper way to set that.
> 
> The best way to test your HTML/XHTML is using the W3C validator.
> 
> You can see what your server is actually delivering by using the dev tools in Firefox. Enter ctl-shift-C and click on  Network. That should display the files being served. Click on the html one. Click on Headers on the right side and look for "content-type". If you are serving HTML, then it will list it as "text/html". If it truly is XHTML, it should say "application/xhtml+xml".


A little more follow up here, just in case anyone else reading this in the future might be interested in serving pages as xhtml: PHP allowed me to serve my pages as application/xhtml+xml by using the PHP header() function to send an additional header prior to sending the actual document or page:



```
header ("Content-type: application/xhtml+xml;charset=UTF-8");
```
... this requires PHP but should work with any webserver software. However, doing this made my ECMAScript (javascript) unusable, most likely because, as I now understand it, I'm partly or entirely using the XML DOM (document object model) instead of the HTML DOM. To serve my page as xhtml, I also needed to change my opening HTML tag: 



```
<html xmlns="http://www.w3.org/1999/xhtml">
```
... I also shortened my META tag from `<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />` to `<meta charset="UTF-8" />` which I don't think is absolutely necessary but it's less confusing from my POV.
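One concrete gotcha behind the script breakage mentioned above: in the HTML DOM, `element.tagName` is reported in uppercase ("DIV"), while in the XML DOM it keeps the source case ("div"), so a comparison like `el.tagName === 'DIV'` silently fails once the page is parsed as application/xhtml+xml. A minimal sketch of a workaround (the helper name is made up) is to normalize case before comparing:

```javascript
// Hypothetical helper: compare an element's tag name against an expected
// name in a way that works under both the HTML DOM (uppercase tagName)
// and the XML DOM (source-case tagName).
function hasTagName(element, expected) {
  return element.tagName.toLowerCase() === expected.toLowerCase();
}

// In a browser: hasTagName(document.querySelector('div'), 'div')
// is true whether the page was served as text/html or application/xhtml+xml.
```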

Once having accomplished this, I decided to revert back to serving my pages as text/html, and would like to say why. One reason is that I simply don't want to re-debug my scripts to work with the XML DOM; it's probably quite doable, I just don't want to take the time to do it. Since this is only a personal project, I have the ability to easily opt out of anything I don't want to do, as well as the luxury of deciding not to support any browsers other than the ones I choose. Active professional programmers usually don't have such luxuries, and may decide to use xhtml or xhtml5 for these reasons as well as other reasons.
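For anyone who wants both worlds, a common middle ground is content negotiation: serve application/xhtml+xml only to clients whose Accept header advertises support for it, and text/html to everything else. A minimal sketch (the helper name is made up, and a real implementation should also honor q-values):

```javascript
// Hypothetical sketch: choose a Content-Type for the response based on the
// client's Accept header. Browsers that don't list application/xhtml+xml
// get the text/html fallback.
function pickContentType(acceptHeader) {
  // Missing Accept header -> safest fallback is text/html.
  const accepts = (acceptHeader || '').toLowerCase();
  return accepts.includes('application/xhtml+xml')
    ? 'application/xhtml+xml; charset=UTF-8'
    : 'text/html; charset=UTF-8';
}
```

The PHP equivalent would branch on `$_SERVER['HTTP_ACCEPT']` before calling header().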

Another important reason for opting out is that further research into this area (after 10 years of basically ignoring it) suggests that the popularity/market share of XHTML has declined over the past 10 years, and is now at around 14% and steadily falling, as indicated here, as opposed to HTML5, which reportedly holds over 86% of market share and is growing, as suggested in that same link, and also indicated here.

I read this as meaning that plain text HTML5 is more likely to be around in the future (like another 10 years), and a little more research has suggested that HTML5 is probably going to be more stable for other reasons. Using PHP's xml_parse() function allows me to check my syntax, which was my main reason for migrating towards XHTML in the first place, and to do the syntax-checking server-side, where I can control it, rather than relying on the browser to do it for me. Furthermore, even if I debug my ECMAScripts to work with the XML DOM, serving pages as application/xhtml+xml would actually _break_ the syntax-checking mechanism I already have in place: the browser would be unable to display the screenshot I showed in my previous post, because the syntax in the page as displayed is already broken, and a browser using the xml parser would decline to, or be unable to, render it.
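To illustrate what that server-side check buys you: the core thing xml_parse() rejects is markup whose tags don't nest properly. Here is a toy version of just that nesting check (not a real XML parser: it ignores entities, uses XML-style exact-case tag matching, and can be fooled by a ">" inside an attribute value; the function name is made up):

```javascript
// Toy well-formedness check in the spirit of PHP's xml_parse(): verifies
// that start and end tags nest properly. Real parsers check far more.
function tagsBalanced(markup) {
  const tagRe = /<(\/?)([A-Za-z][\w:-]*)[^>]*?(\/?)>/g;
  const stack = [];
  let m;
  while ((m = tagRe.exec(markup)) !== null) {
    const [, closing, name, selfClosing] = m;
    if (selfClosing) continue;                // <br /> needs no end tag
    if (closing) {
      if (stack.pop() !== name) return false; // mismatched or stray end tag
    } else {
      stack.push(name);
    }
  }
  return stack.length === 0;                  // everything opened was closed
}
```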


----------



## drhowarddrfine (Apr 25, 2019)

Vull said:


> I read this as meaning that plain text HTML5 is more likely to be around in the future (like another 10 years)


HTML5, as discussed in this thread, is just HTML: the same HTML in use since Tim Berners-Lee created it long ago, plus all the updates since then. The term HTML5 is an all-encompassing one that includes all the technologies involved in making a modern web site, but the version number "5" will be dropped eventually, as it carries no meaning. Originally it was just that, a version number one step above HTML 4.01. HTML will be around forever, or until someone decides it needs to be replaced and does so.


----------



## Deleted member 30996 (Apr 25, 2019)

Vull said:


> Another important reason for opting-out is that further research into this area (after 10 years of basically ignoring it) suggests that the popularity/market share of XHTML has declined over the past 10 years, and is now at around 14%, and steadily declining, as indicated here, as opposed to HTML5 which is reportedly over 86% of market share and growing, as suggested in that same link, and also indicated here.



I'd be interested to know what percentage of the sites counted as XHTML actually validate. If it doesn't validate, it isn't considered to be XHTML.

Mine are static pages served up as:


```
<html xmlns="http://www.w3.org/1999/xhtml" lang="en-US" xml:lang="en-US">
<head>
<meta http-equiv="content-type" content="application/xhtml+xml; charset=utf-8" />
```


----------



## Vull (Apr 25, 2019)

Trihexagonal said:


> I'd be interested to know what percentage of the sites using HTML validated. If it doesn't validate it isn't considered to be XHTML.
> 
> Mine are static pages served up as:
> 
> ...


I still verify each dynamically-generated page using xml_parse() for XHTML syntax conformity, and most of my html-generating PHP scripts have previously been verified with the W3C validator, but I'll probably look at this further when I have more time, and will most likely wind up adding a profile option for each user account to toggle the generation of application/xhtml+xml. This will provide a good exercise for figuring out what the important differences are between the XML DOM and the HTML DOM, and hopefully for debugging my ECMA-conforming JavaScript to sniff out the differences and handle both DOMs.
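For that toggle, the script side can sniff which DOM it got by looking at the content type the server actually sent (`document.contentType` in browsers). A hypothetical helper, written as a pure function so the decision logic is easy to test:

```javascript
// Hypothetical helper: decide which DOM flavor the page was parsed with,
// based on the Content-Type string the server sent.
function domFlavor(contentType) {
  // Strip parameters like ";charset=UTF-8" and normalize case.
  const type = (contentType || '').split(';')[0].trim().toLowerCase();
  const xmlTypes = ['application/xhtml+xml', 'application/xml', 'text/xml'];
  return xmlTypes.includes(type) ? 'xml' : 'html';
}

// In a browser:
//   if (domFlavor(document.contentType) === 'xml') { /* use XML DOM paths */ }
```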


----------

