Because you're staring at a homebrew template system right there.
Compare to a Jinja2 loop:
{% for item in navigation %}
<li><a href="{{ item.href }}">{{ item.caption }}</a></li>
{% endfor %}
You can use PHP as a template system, but that also means you can't move your stack to anything else.
I have a personal site running on Pelican with Jinja2 templates. If I wanted to, I could move over to Django and reuse my templates.
PHP is built around doing a request and getting a page as a response. You fill in forms, submit, and the PHP script that gets POSTed to gives you a new page. You can do a lot more than that, but PHP scripts being HTML files with spliced-in server-side code screams this flow.
This is an outdated paradigm, as it pushes the programmer towards building static pages. Think about it: how would you make anything that interacts with the server without loading a new page? You're gonna have to write some JavaScript that hooks into what you've generated in PHP. When you then change the PHP code, you're gonna have to remember to update the JavaScript as well.
The paradigm has been shifting to web apps for years now. The server sends a JavaScript file that's responsible for rendering the entire page. With the client rendering the page, the server only sends and retrieves data, as opposed to web pages. The server doesn't render pages unless you want to shorten initial load times (this is called isomorphic JS). This has tons of benefits:
Page loading is way faster, because navigating to a different page is done client-side
It's super intuitive to build interactive shit; the example apps for most frameworks and languages implement Tetris, a simple Mario, or Flappy Bird
Server endpoints are RESTful, and thus easier to understand. You write a request handler that adds a person to the database and returns an empty 200 on success, instead of one that adds a person to the database and then sends you the next page you're supposed to see.
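To make the contrast concrete, here's a minimal sketch of client-side rendering fed by plain data, mirroring the Jinja2 navigation loop from earlier. The endpoint URL and field names are made up for illustration:

```javascript
// Render the navigation list client-side from plain data.
// The server only sends JSON; the markup lives in the client.
function renderNavigation(items) {
  return items
    .map(item => `<li><a href="${item.href}">${item.caption}</a></li>`)
    .join("\n");
}

// In a browser you'd fetch the data and inject the result, e.g.:
//   const items = await (await fetch("/api/navigation")).json();
//   document.querySelector("ul.nav").innerHTML = renderNavigation(items);

// Example data, mirroring what the template loop iterated over:
const nav = [
  { href: "/home", caption: "Home" },
  { href: "/about", caption: "About" },
];
console.log(renderNavigation(nav));
```

Change the PHP-side template and nothing else knows; change the data contract here and both sides break loudly, which is the point.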
I'm sure it's all possible in PHP, but you're gonna have to fight its fundamental design choices to get it.
This project isn't switching stacks anytime soon. I haven't run a CLOC in a while and already left the office for the week, but there's 10KLOCs of business logic, maybe getting into the 100KLOC range counting the persistence stuff.
Some applications naturally go for static pages, though. This particular project actually has quite a bit of genuinely write-once content; stuff gets written to disk on edit, and then we just serve the stored copy. A lot of users wouldn't even notice if the database was down.
99% of the interaction in this app is "submit form, get back any validation errors and display them, else go to a new page". You're not going to get validation errors on a newly-loaded page, so there's no duplication of code there. The three other cases? The PHP sends back HTML. It's not very AJAX-y but it works fine.
If I were writing something more app-like, I totally agree that PHP would be a bad choice. I'm not particularly fond of PHP (even as of PHP7 it's kind of a sucky language, I've considered trying to create an alternative) but it does seem to be the best at the job it tries to do.
Also FWIW I don't do much of the front-end stuff. I stick to the database and business logic tiers. The other dev is the front-end guy. He did try doing it in Node as a full web app thing before I got brought on, but he decided it was a poor fit for the task.
https://files.facepunch.com/forum/upload/132541/3190a26c-1583-40a6-aa17-8cde435b916b/IMG_20190111_122143.jpg
Got to see Radeon VII in person today.
Also ran into JayzTwoCents, dude is tall asf.
Yeah, it sounds like a waste of time to recode the stack, especially with it being that big.
I disagree with write-once websites naturally going for static pages, though. Why get the validation errors [I]after[/I] submitting? Having them show up while you're writing is a much better experience. Also, a background process that backs up your progress every once in a while becomes a big stress relief once you have forms that take longer than a minute or so to edit.
Recently I've had to set up WordPress for a colleague, and besides the fact that it's a horrible piece of software that modifies its own damn install directory when you click the "install plugin" or "reinstall WordPress" button, it's slow af. I'd consider it a perfect "write once" example, but I reckon it would have been a lot faster, more responsive, and nicer to work with had it been written as a web app.
Outside of PHP though, if you actually have a well-built stack, a template processed on the server is much faster than one processed on the client.
Like loading an entire page of a forum to presentation state in less than a second.
I'm sick of this bullshit, because loading dynamic pages on my mobile phone, which has significantly less processing power than a desktop, takes a long time, heats up the device, and drains the battery (as it does on any battery-powered computer).
Then you have the caveat of processing a webpack bundle or whatever on every page load.
Like, computers are getting more powerful, only for developers to make dumb decisions that degrade performance back down to unacceptable levels.
The mindset of most of the web developers is just pretty much Todd Howard.
"It Just Works", regardless of how much bullshit there actually is
That can still be done. It has a stupid name, but isomorphic JS is the way to go there. The server renders the initial page. The client is downloaded separately and hooks into that page to add the interactivity. That will load the entire page of a forum in less than a second.
The server sending you a ton of shit to download isn't exclusive to web apps. Any website where people don't care about resource size will do that. Dynamic pages aren't necessarily slow either. Try running Elm's package website, Elm's TodoMVC, or Miso's Flatris. Those run pretty snappy on my phone. And the first isn't just a small example either, it's a full website.
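As a sketch of how isomorphic rendering avoids duplicated logic: one render function shared by server and client. Everything here (function names, the state shape) is invented for illustration, not taken from any of those sites:

```javascript
// One render function, shared by server and client.
// On the server it produces the initial HTML; on the client the same
// function re-renders after state changes, so there's no duplication.
function renderThreadList(threads) {
  const rows = threads
    .map(t => `<li data-id="${t.id}">${t.title} (${t.replies} replies)</li>`)
    .join("");
  return `<ul id="threads">${rows}</ul>`;
}

// Server side (Node): send the pre-rendered page, e.g.
//   res.end(`<html><body>${renderThreadList(threads)}</body></html>`);
// Client side: hydrate by attaching event handlers to the markup that's
// already there, then call renderThreadList again on state updates.

const initial = [{ id: 1, title: "Hardware & Software v2", replies: 5000 }];
console.log(renderThreadList(initial));
```

The user gets a full page on the first byte of HTML, and the client takes over from there without a second render pass of its own design.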
https://files.facepunch.com/forum/upload/1755/be1872ec-c87a-4b00-9f5a-8b39c31dd786/image.png
Bless this checkbox
But these are some good examples of front-end done very, very well. In what feels like the majority of cases, it's the polar opposite.
Something like Reddit feels okay with some caching, but then opening Twitch embeds makes it shit the bed, which I'll probably just blame Twitch's media player for.
Video playback on the device should be hardware-accelerated, but idk. Kinda want to wrap this up since I'm kinda busy :s
Yep, either this or skeletal page layout + incremental/lazy loading.
Because there's a lot of validation errors that would pop up during incomplete forms. Like, there's a form with optional volume and volume unit fields, but if you enter a volume, the volume unit becomes mandatory (we need to know if that "100" is 100 liters, 100 milliliters, or 100 cubic feet). If we just did background validation, we'd be popping up a validation error as they're tabbing to finish entering data, which is just distracting.
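That volume/unit dependency boils down to a single conditional rule. A minimal sketch, with invented field names (not the project's actual code):

```javascript
// Conditional validation: volume is optional, but once a volume is
// entered, the unit becomes mandatory so we know what "100" means.
function validateVolume(form) {
  const errors = [];
  const hasVolume = form.volume !== undefined && form.volume !== "";
  const hasUnit = form.volumeUnit !== undefined && form.volumeUnit !== "";
  if (hasVolume && !hasUnit) {
    errors.push("Volume unit is required when a volume is given.");
  }
  return errors;
}

console.log(validateVolume({ volume: "100", volumeUnit: "" }));  // one error
console.log(validateVolume({ volume: "100", volumeUnit: "L" })); // []
console.log(validateVolume({ volume: "", volumeUnit: "" }));     // []
```

Run eagerly on every keystroke, this rule fires exactly while the user is mid-entry, which is the distraction described above; run on submit, it only fires when the form is actually incomplete.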
Also, we do use AJAX for form submission. So you do not leave the page on form submit, only when the JS gets back a response that the save was successful. We've taken great pains to make this site extremely fast - I actually set a goal that all pages (save a few where it's obviously impossible) should load in under a frame. 16ms from when the user clicks a button to when the browser finishes rendering the new page. We're meeting that performance goal.
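That submit flow can be sketched as a small decision function with the transport injected. Every name here is an assumption for illustration, not the project's actual code:

```javascript
// Sketch of the AJAX submit flow: POST the form, stay on the page and
// show validation errors, and only navigate away once the save succeeded.
// postForm is injected so the decision logic stays testable.
async function submitForm(formData, postForm, ui) {
  const response = await postForm(formData);
  if (response.ok) {
    ui.navigateTo(response.nextPage); // leave the page only on success
  } else {
    ui.showErrors(response.errors);   // stay put, render errors inline
  }
}

// Example with a fake transport that always fails validation:
const fakePost = async () => ({ ok: false, errors: ["Name is required"] });
submitForm({}, fakePost, {
  navigateTo: url => console.log("go to", url),
  showErrors: errs => console.log("errors:", errs.join(", ")),
});
```

Keeping the navigate/show-errors branch in one place is what makes the "never leave the page until the save sticks" behavior easy to guarantee.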
We're not making completely static, javascript-less pages. We're just not making a monstrous web app that takes 8GB of RAM and takes a minute to load. (Seriously, after we spent the last thread bitching about Electron apps, how is everyone somehow now in favor of huge JS applications?)
There's a mile of difference between the two, let's be honest.
The browser isn't the slowest part of an electron application though, that's just the bloat
Today I have experienced first hand why backups are really important
I have a little project that I work on during free time, and a bug in an application broke my whole project file where nothing was salvageable.
Thankfully my paranoid ass does automatic backups every day and I was able to restore my project files from this morning, so only like two hours of work were lost.
One of these days I will torch Autodesk's offices because of their buggy, crashy software.
Honestly, I'm gonna try to get my family to get some cheap NAS and run Veeam for computer backups, because you never know how much of a lifesaver it is until shit hits the fan.
Installed a new 1U PDU in my server rack. Didn't even *really* need it. But it's a tidy way to power things off that side of my room and it looks cool. I can also power more auxiliary things with fewer connections out the back to my UPS.
It had a massive 15ft cord, so I cut that off down to about 5-6 feet and put my own plug on it. Nice and tidy.
Today, I moved my server (Haswell Xeon, 32GB RAM, Supermicro MicroATX board, 500GB SSD, 3TB HDD, Windows Server 2016 with HyperV) from an old Bitfenix full-ATX case to this short Rosewill 2U case. It's a tight squeeze getting everything in there since the drives overlap part of the motherboard. When I realized my front panel LEDs were backwards, I had to take out the HDDs to fix it. If I had put my 3.5" HDD in the second slot, the power cable would've prevented the CPU fan (Stock Intel cooler) from spinning. Despite all that, the drives and the CPU sit at 20°C. The 80mm case fans aren't too bad, but I ordered some Noctuas to bring the noise down a bit more.
When I end up buying a house (Soon! Thank fuck for cheap homes in Northern Ontario), I'll finally get around to buying a rack to be able to keep my rack mountable Mikrotik router, Mikrotik switch and now my server. All that's left to rack mount is my FreeNAS box, but cases with that many hotswap bays are expensive.. So I'll probably just keep the Silverstone case I have now, flip it horizontal and shove it on a shelf.
I'm fucking 26 today.
I am literally double the age I was when I registered my FP account in 2006.
Are there any browser plugins that can help me remember where I got a file from like sticking the page name in the filename or something? I can't believe I can't find any results for anything like this.
Chrome's download manager should keep the URL of the file, as long as it wasn't cleared.
Looking through HD Sentinel on my server, two of the hardest-working drives are my old ST4000VN, one of their old NAS drives, and amazingly an original ST3000DM001, which was either shucked or from my original batch of four from 2012. It has 1500 days of spin time, so it might have been a shuck.
That ST3000DM001 has 172TB of writes on it since going into the server, and 1.45 petabytes of reads. Bonkers. The NAS drive has just over 1 petabyte of reads and 225TB written.
That ST3000DM001 still is in perfect health too.
Sounds like my Spinpoint F3, it's still going strong.
Hi all, I'm having PC issues and I'm not sure if I should just make my own thread so I'm gonna post in here for now. My specs are:
i5-2400S @ 2.50GHz
8GB DDR3 RAM
Windows 10
NVIDIA GeForce GTX 960 2GB
ASRock Z75 motherboard
1TB HDD
To describe the crashes, my PC cuts to black and immediately reboots, no blue screens. Happens consistently when playing games, inconsistently when doing less intensive operations. Thanks!
Sounds like a power or heat problem. Happening more often during gaming is a big hint; that's when it will be drawing the most power and running the hottest. Thermal issues don't usually lead to an instant power-off, they usually just throttle, but voltage fluctuations definitely can. I'd still check thermals first, since they're easier to fix - use SpeedFan or some other program to look at what temperature your CPU/GPU/etc. are running at, and whether any fans have died. If the temps are high, dust out the fans and heatsinks; that should help.
Otherwise, it's probably a dying PSU. I don't know any good ways to diagnose that, outside of either some crazy elaborate setup, or just swap it for a new one and see if that fixes it.
I keep a Corsair PSU that I got on sale for $30 just as a known-good test unit.
So I was given the greenlight to use a recently decommissioned server at work for a new back-up archive server. All is good to go except they turned the server off when they finished migrating everything to the new ones and literally no one knows the login password for it... Like, the fact this server had 0 fucking downtime for literal years is a miracle because even corporate IT has no password saved for it and it isn't any of their old ones on file either.
So I guess I'm forced to reinstall the OS now, which I was probably going to do anyway but still.
New business venture: Rogue-like server hosting. No backups, no downtime.
Between geo-routing issues, I'm content with 99.95% uptime. Five nines is basically a European Extreme no-damage speedrun.
I'm guessing this is a Windows server, since with Linux servers you would have SSH keys installed. Correct me if I'm wrong. Even then, it seems like your workplace needs to work on its password management. Passwords like that should not get lost.
I've wondered, what's it like to run a remote Windows server? How do you deploy it? Use remote desktop to double click installer files?
Linux servers don't necessarily mean SSH keys, but yeah. Major human error there, as most things are.
Remote servers are usually RDP, but Microsoft has this weird remote thing with PowerShell, and now with their OpenSSH addition, you can do that too. But it's dumb because everything in Windows is GUI.
Yeah they do it all via RDP.
Tbh I just ended up replacing utilman.exe with cmd.exe so I could change the admin password myself from the login screen. It's uhh, a totally standard recovery feature implemented into the OS...
Turns out they actually volume-licensed the servers too, so fuck me and the horse I rode in on. Looks like we're doing it my way and using FreeNAS, unless IT will give me some license keys, since I want to reinstall Windows.
You use RDP. It's like running normal Windows, except with no graphical niceties and lag on your actions.
All the downsides of a server, with the downsides of Windows. Also, in my personal experience, RDP was harder to secure (if you didn't have a private network) than SSH is.