The Unreal Engine Wiki is now permanently offline (forums.unrealengine.com)
sytelus 1467 days ago [-]
This is bad, very bad. A complete and utter disrespect for the people who poured their blood and sweat into creating the content for them, for free. Migrating only "top" content is insufficient and ignorant. The value of a wiki is almost entirely in the long tail. The vast majority of Unreal Engine is undocumented, and you run into all kinds of issues few have seen before. If it wasn't for the community creating content for them, it would be unusable for many users. To rub salt in the wound, they aren't even telling you exactly why they are doing this, or, at the very minimum, why making a read-only archive is so hard. Didn't expect this from Tim Sweeney's company.
gentleman11 1467 days ago [-]
I posted recently about how bad UE4's C++ documentation is, and how it drives people to Unity. They have been promising to make it better for 7-ish years, from what I gather in the forums. What is with them?
Aqua_Geek 1467 days ago [-]
Looks like the post got updated with some reasoning:

> Why can’t we put a read-only archive online currently? The Wiki, even in it’s read-only state, was presenting security risks, and it was deemed necessary to take it offline.

smacktoward 1467 days ago [-]
So crawl the Wiki pages, grind out static HTML copies of them, and make those available. Not many security risks associated with static HTML.

What am I missing here?

Ndymium 1467 days ago [-]
I did this for a forum I used to host for many years. I crawled a version for each person (using an account with the same rights they had, so it wouldn't contain anything extra), with static assets and referenced images, and zipped that up so they could browse the old posts locally and nostalgize. It worked really well.

Maybe the people at UE need to be taught how to use wget.
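
For reference, a single wget invocation gets you most of the way there, something along these lines (the URL is a placeholder, and a big wiki may want gentler rate limiting):

    # mirror the site, rewrite links for local browsing, pull in CSS/images,
    # and keep .html extensions so the pages open cleanly in a browser
    wget --mirror --convert-links --adjust-extension --page-requisites \
         --no-parent --wait=1 https://wiki.example.com/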

nitemice 1467 days ago [-]
The only other "security risk" I can imagine such a read-only wiki could present is if it documents something that could be considered "risky" from a security perspective. That would make this more of a censorship situation.
moonchild 1467 days ago [-]
> We still have the data, and as mentioned above, we will work to migrate the top content into various official resources. In the meantime, we’re investigating how we can make this content available, even in a rudimentary way.

This indicates otherwise.

conistonwater 1467 days ago [-]
I read it's very easy to do that nowadays with Chrome's Puppeteer.
onion2k 1467 days ago [-]
Static content needs a web server and a way to move the content to the server (FTP, SFTP, rsync...), an SSL certificate, a DNS entry, DDoS protection, and a way to manage credentials for all of those things. It's not zero risk at all. There are plenty of ways to attack a static site. The only way static is safer is that you're not executing scripts on the server (which is a massive win, I don't want to downplay that aspect); everything else is the same as a dynamic site.

That's still a lame excuse though. They could have outsourced all the "hard" stuff to a free Netlify account.

mekster 1466 days ago [-]
Like what problem, specifically, can you not mitigate? If you can't just host static content on a new, empty server and get it going, how does the rest of the world work at all?

That is just an excuse.

nikanj 1467 days ago [-]
Turn both the DB and the disk read-only and your security issues should mostly be solved.
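
Roughly, assuming MySQL and a Linux host (paths here are placeholders):

    # stop accepting writes at the database level
    mysql -e "SET GLOBAL read_only = ON;"
    # and remount the wiki's data partition read-only for good measure
    mount -o remount,ro /var/www/wiki
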
Kiro 1467 days ago [-]
How can it be a security risk?
stallmanite 1467 days ago [-]
This kind of crap is why I will always prefer a local archive and use the hell out of youtube-dl and similar tools.
oriel 1466 days ago [-]
I'm in a similar boat. What other similar tools have you found yourself using? For example, I've been learning the hell out of wget, but I find my tool library lacking, or lacking in trust.
Traster 1467 days ago [-]
The thing that seems super strange to me is that they don't seem to have warned people they would do this, the first comment is

> This isn't very helpful, Amanda! I know that the wiki wasn't optimal, but there were many wiki pages developers like me had bookmarked for years because they contained comprehensive and easy information, which is now missing. Why not just keep the wiki read-only online? Just to retain the old pages? I'm pretty lost right now without some of these articles and I don't understand why the only option you had was to completely disable it. Please think about opening it up again just for read. I don't care about the maintenance mode, but the wiki was an important learning point, which is now gone.

If you don't want to support the wiki that's fine, you don't owe anyone hosting, but if you're going to dump it, at least give someone the opportunity to scrape the site and host it themselves.

c3534l 1467 days ago [-]
Lesson I've long learned: never bookmark anything. Bookmarks are to temporarily remember a URL, not to archive content. If the content is important or meaningful to you, save the page.
JeremyNT 1467 days ago [-]
Yes this. Grab the 'singlefile' add-on[0] and train yourself to hit that instead of bookmarking. You'll be much happier if you ever need the information again!

[0] https://addons.mozilla.org/en-US/firefox/addon/single-file/

zelphirkalt 1467 days ago [-]
I'll bookmark that page, thank you!
nocman 1467 days ago [-]
I see what you did there. :-D

Naturally, you mean you'll SAVE that page, right? LOL.

zelphirkalt 1467 days ago [-]
Ah, right, of course, so that I can later access that page locally and install the extension whenever I need it. Then with that extension installed, I could even save that extension page!
hyperpallium 1467 days ago [-]
Firefox: reader view then bookmark saves the content.
ben0x539 1467 days ago [-]
I'm surprised at myself for shilling a service I don't even use and don't have any stake in, but https://pinboard.in/ lets you bookmark things and then also archive those things so they don't go away.
jimmaswell 1467 days ago [-]
That site could do the same thing as the wiki.
goatsi 1467 days ago [-]
You can download a complete archive of all the bookmarks and pages: https://alexwlchan.net/2017/07/backing-up-pinboard-archives/
miladiir 1467 days ago [-]
I tried this in the past, but could not even get it to compile. Can you share your working setup, or did you just find this?
MaxBarraclough 1467 days ago [-]
Not impossible, but it's a big part of how the site makes money.
Causality1 1467 days ago [-]
Absolutely. When it comes to sites with mostly static content I have a habit of archiving them with HTTRACK, though I wish there was a more modern solution with better support for active content.
nikanj 1467 days ago [-]
Too bad modern single-page apps make it very hard to just scrape the site.
imtringued 1467 days ago [-]
For some inexplicable reason, some pages add script tags pointing to external JavaScript. So what happens is that you archive a page, the original goes away, and when you visit your copy you notice all the broken links pointing to JavaScript files that are gone as well.

Archiving even a single webpage is non-trivial. You're dependent on the Internet Archive if you want a reliable backup.

themodelplumber 1467 days ago [-]
Such a solution presents a bit of a challenge though, given that you'd be 1) broadcasting a security issue, and 2) possibly compounding it by presenting your audience with some really disagreeable news.

I can at least see why they'd hesitate to leave things up, depending on the anticipated risk and likelihood of addressing it in reasonable time.

Edit: Downvote if it makes you feel better, but this is really how groups execute on problems like this without taking time away from other important projects. "Security issue? Extensive fixes needed? Take it down!"

loktarogar 1467 days ago [-]
"We're going to shut down the wiki starting xx/xx."

Don't need to explain, don't need to make a fuss - just announce and move on.

themodelplumber 1467 days ago [-]
Doesn't that also just open up a lot more community feedback/pushback? I'm thinking somebody saw it coming in any case and made the call.
geitir 1467 days ago [-]
Hackernews is shutting down in 45 seconds. Please save the pages of any bookmarks now
loktarogar 1467 days ago [-]
About as much as we have right now. The difference is people can at least make backups of the content they need.
vonseel 1467 days ago [-]
Can’t you just lock the system down and isolate it enough so the security vulnerability is a non-issue? Certainly there’s an ops solution to things like this.
djsumdog 1467 days ago [-]
wget or other site rippers can just make static content out of it. You can write a script to put a notice/header at the top and host it on nginx... or an S3 bucket.
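
A rough sketch, assuming a wget mirror sitting in ./mirror, pages with a plain <body> tag, and a made-up bucket name:

    # prepend a "static archive" notice to every page
    find mirror -name '*.html' -exec sed -i \
      's|<body>|<body><p><strong>Static archive: this wiki is no longer maintained.</strong></p>|' {} +
    # then push the whole tree to a public S3 bucket
    aws s3 sync mirror/ s3://example-wiki-archive/ --acl public-read
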
csdreamer7 1467 days ago [-]
I do not think your comment deserved to be down voted. That may have very well been their line of thought.

But I also think loktarogar provided a better solution.

And that is what a discussion forum is for.

tomc1985 1467 days ago [-]
They could provide the public or a trusted third party with a database dump, or put it on a separate hosting service. With a bit of effort either course would relieve them of security concerns.
user5994461 1467 days ago [-]
I had to decommission a major wiki in a very large company not long ago. I can give you four reasons why these things happen, possibly without notice.

1) The wiki software and database have been abandoned for years. There is no maintenance and no further release.

2) It will stop working shortly. Like, it doesn't start on Ubuntu > 14 or current MySQL at all.

3) It already stopped working and/or the content is already lost. Can be an accidental deletion or the 10 year old server passed away.

4) Security vulnerability. Remote code execution / SQL injection in the wiki software. That can't be fixed because of point one.

I wrote a longer blog post on the software death cycle in companies: https://thehftguy.com/2019/12/10/why-products-are-shutdown-t...

tempestn 1467 days ago [-]
It shouldn't be hard to save a static copy of the actual HTML and replace the dynamic site with that though. Or at least give others enough time to do the same.
user5994461 1467 days ago [-]
It's not hard, except when it is and it takes weeks of work. From my own experience:

1) httrack wasn't able to crawl anything for some reason. wget only worked with special flags available in the latest version.

2) All the links are broken. Wikis have their own linking system between pages that is processed on the fly. An archive with all links hardcoded to wikidump.example.com/wikidump/page really doesn't work in place.

3) More challenges with saving pictures and attachments.

4) Unicode characters in some pages, that broke the page and/or the crawler.

5) Infinite crawling and a ton of useless pages. Consider that every URL with a query string is a separate page: /page?version=50&comparewith=49 (see the wget note after this list).

6) Crawling a large number of documents takes forever. It could be an hour wasted on each try. Consider tens or hundreds of thousands of unique URLs to save; see point five. I really wish the crawler could parallelize and run on the same host as the wiki.
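
For point five specifically, newer wget can at least filter out the query-string noise, though it doesn't help with parallelism (URL is a placeholder):

    # skip history/diff URLs like /page?version=50&comparewith=49
    wget --mirror --convert-links --page-requisites \
         --reject-regex '.*\?.*' https://wiki.example.com/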

tempestn 1466 days ago [-]
Good points. Perhaps I should have said, "It should be very possible," or similar instead. I'm sure there would be challenges, but I would expect them to be surmountable.
dontbeunethical 1467 days ago [-]
Not that hard?

You'd need to process each page then data mine it.

smacktoward 1467 days ago [-]
HTTrack (https://www.httrack.com/) makes tasks like this trivial.
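
Roughly, following the example from HTTrack's own docs (URL is a placeholder):

    # mirror the wiki into ./mirror, staying on the wiki's own domain
    httrack "https://wiki.example.com/" -O ./mirror "+wiki.example.com/*"
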
imtringued 1467 days ago [-]
It doesn't run a full browser engine so it won't work with the vast majority of websites.
eof 1467 days ago [-]
A company that can build a game engine should probably be able to crawl a site and save an html dump?
biggestdecision 1467 days ago [-]
Just archive.org every page on the wiki. Their API will let you do 15 per minute.
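
Something like this would do it, given a list of page URLs; the Save Page Now endpoint is just web.archive.org/save/<url>, and the 4-second sleep keeps you around 15 per minute:

    # urls.txt: one wiki URL per line (how you build that list is up to you)
    while read -r url; do
      curl -s "https://web.archive.org/save/$url" > /dev/null
      sleep 4
    done < urls.txt
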
tvbusy 1467 days ago [-]
Sounds like someone accidentally deleted it and they have no backup. Instead of admitting to not having a backup, they can just say it was intentionally shut down, and ask their staff to salvage whatever is available from archives.
pfundstein 1467 days ago [-]
My first thought as well, but my second thought was why wouldn't they own up to it? Surely they know that owning up to something like this earns them much more respect and positivity from the community than "taking it down" for no good reason, or worse trying to cover it up.
andrewflnr 1467 days ago [-]
(a) that's a lot easier to say than do, fear is powerful and (b) no, they wouldn't get any respect for Not. Having. Backups.
cortesoft 1467 days ago [-]
No, my guess is it was hacked so they shut it down. That is probably why they say "security risk"
SparkleBunny 1467 days ago [-]
I doubt this.

"After over a year in maintenance mode, the official Unreal Engine Wiki is now permanently offline."

It seems to have been a problem for some time.

erichocean 1467 days ago [-]
I always wonder why companies do stupid things—like this.

At the very least, put it in read-only maintenance mode, with a big disclaimer at the top saying so.

But to just destroy information, information about your own product, is…well, it's stupid. Profoundly so.

Sophistifunk 1467 days ago [-]
In my experience this sort of decision is always driven by sales / marketing people deciding they want to funnel the users into some other part of the site that nobody currently uses because it's not as good.
tialaramex 1467 days ago [-]
Not necessarily sales/ marketing, but an Old Thing is not a new thing you'll be praised for, it's just another annoying cost that comes out of your budget. Maintenance is boring.

Building a New Thing comes with excitement and praise.

Microsoft has done this for so long that their own people strongly recommend using URLs under https://aka.ms/, their long-term link maintenance software, so that when yet another "exciting" change happens to the entire Microsoft web site you can still find all the vital documentation. Maybe their "Knowledge Base" articles, for example, will become a wiki again, and then a social networking site, and then a blog the week after, and then a different wiki with newly inscrutable URLs. But the aka.ms link can be updated so that you don't need to spend an hour navigating.

The more important maintenance becomes to a company's actual financial health the more senior management rebel and become sure their destiny is to radically reinvent the company. If directors did their actual job (working for the shareholders, for whom "exciting" aka "massively loss making" isn't the goal) the very next thing you'd hear after the CEO announcing the company has a new name, new branding, new logo, would be the Chairman arriving to tell everybody actually it's not a new name, new branding or new logo, just a new CEO, sorry, no big leaving do he was escorted off site by security.

hnzix 1467 days ago [-]
Some marketing people cannot grok user-generated content and absolutely freak out over the perceived lack of control. It's completely irrational but I've had pinhead marketing droids kill my content before with no replacement plan. It's infuriating.
nikanj 1467 days ago [-]
"Your bonuses are tied to the amount of traffic your blog posts generate" is such a 2020s KPI. And I've heard of more than one person whose compensation package has those metrics.
craftinator 1467 days ago [-]
Bingo.
rwnspace 1467 days ago [-]
I think corporate upper management is the only echelon of business capable of such waste and cynicism; maybe they noticed that traffic was going to the wiki and not the dedicated support pages.
skissane 1467 days ago [-]
> But to just destroy information, information about your own product, is…well, it's stupid. Profoundly so.

From a purely historical/retrocomputing point of view, I'm really disappointed IBM took the legacy Library Server site offline.

It was full of fascinating detritus... old OS/2 manuals, various ancient layered products that IBM dreamed up once upon a time back in the 1990s or early 2000s and which never went anywhere (DDM, DCE, etc)...

Why couldn't they just donate the contents to archive.org or something like that, if they don't want to host it anymore?

At least they still have the offering information site online – https://www-01.ibm.com/common/ssi/ – full of product announcements from the 1980s. To pick some random examples:

https://www-01.ibm.com/common/ssi/ShowDoc.wss?docURL=/common...

https://www-01.ibm.com/common/ssi/ShowDoc.wss?docURL=/common...

Sadly, I'm sure sooner or later the old stuff from there will be gone too.

kalleboo 1467 days ago [-]
There's a similar deal going on with Apple.

They still host a bunch of old classic Mac/Apple II files, but they're no longer browsable; you have to find old links to them.

E.g, here's the first floppy image to install MacOS System 7.5.3 http://download.info.apple.com/Apple_Support_Area/Apple_Soft...

Their support site also still has a bunch of articles from the 90's in the database, you can still get them to show up in the searches on their own support site, but they're not browsable and hence not indexed by Google.

Anyone need help getting HyperCard 1.2.5 working on their Macintosh IIfx with the NuBus Macintosh Display Card 8*24 GC? https://support.apple.com/kb/TA46247?viewlocale=en_US&locale...

egwynn 1467 days ago [-]
By the looks of the most recent snapshot on archive.org[0], that’s what they did.

[0] https://web.archive.org/web/20200329185200/https://wiki.unre...

EDIT: It seems like the main problem is that they didn’t do a good job communicating their intention and timeline for removing the old wiki.

opencl 1467 days ago [-]
It had been in read-only maintenance mode for quite a while. Then they just took the whole thing down with no warning. The wayback machine copy unfortunately seems to be missing a lot of articles.
muststopmyths 1467 days ago [-]
Really stupid. UE4 documentation is generally crap and the wiki resources were quite invaluable. It might be outdated information but it at least gave you a starting point to figure out where to look in the source for more information.
jokoon 1467 days ago [-]
Since I started using Ogre3D I always had a hard time settling down to feature rich engines like unreal or unity.

I don't know how often giving beginners access to a space shuttle will lead to a successful project that can compete with non-indie game developers.

There is also a fine line between an indie team of developers who can benefit from those tools, and experienced game developers who would not need them.

It seems unreal and unity are just very capable, but cheap, tools that are well-marketed towards students and beginners. The problem is, once those developers learned to use those tools, they are still unable to develop a game without those tools, which is a huge win for unity and unreal.

Generally I tend to believe Unreal and Unity only enable developers to make games that will never be able to compete with more established and skilled game developers. I think it's a pretty sad situation, because initially I really believed indie games were able to compete with those big studios, but they're not, and I think Unity and Unreal are responsible for this. It seems the whole FOSS mantra/paradigm/philosophy has a lot of trouble penetrating the field of game development, maybe because games are heavily monetized towards consumers, unlike other software. It bothers me.

fiblye 1467 days ago [-]
As someone who's written engines, worked in bare bones do-everything-yourself frameworks, and used Unity, not everyone has the time or ability to sit down and code almost everything from scratch. Some people have simple ideas that involve mostly gluing parts together and big engines are fine for that. Some can program everything if they want to, but they want a decent physics, rendering, and user input engine from the get-go so they can get to work on their ideas and not endless boilerplate code.

Most indie gamedev projects I've seen die involve people who are primarily programmers who get stuck in framework hell. They're endlessly trying to build and tweak basic features for a basic framework, when any major engine has all of those features out of the box or with a 5 second package download.

I've had to get myself out of the habit of doing everything myself, admitting defeat, and downloading existing packages. Oftentimes someone out there has something better than what I wanted to make and easily tweakable, saving weeks of time and still allowing me to give it my own touch.

Even for a total amateur who won't use 1% of the features, Unity and Unreal have two huge things: support and a community. You can ask a question anywhere about anything and generally there's someone who can help and even give a precise solution. That's huge.

imtringued 1467 days ago [-]
I like reinventing the wheel on the game logic side. The payoff for novel ideas is often pretty high. Not everything is a tiled side scrolling platformer or a top down RPG. However, with graphics it's the opposite for me. The amount of setup needed to even show a single triangle feels like a waste of time. You're tied to a single graphics API so you either have to write the code again and again for each API or you use an engine. The payoff is pretty low because spending hundreds of hours on a custom graphics engine doesn't make your game stand out unless you are very skilled.

Of course this doesn't mean you shouldn't write your own graphics engine for fun and as a learning opportunity but if your goal is making a finished game then you should avoid falling into this trap.

pjmlp 1467 days ago [-]
That was one of the reasons (among others) that I never went beyond demoscene like stuff, always went too deep into engine stuff, and never produced anything really interesting as a game.

Anyone that wants to do games should use a pre-made engine, if the game is at all interesting, people will play it, even if the technology sucks.

ironmagma 1467 days ago [-]
What engines are you suggesting make it easier to create those games that compete? My experience is that UE4 and Unity are both enabling of indie developers to make very high quality games. The only real limitations are how much effort you put into the art. UE4, while hard to code for, is still orders of magnitude less work than coding all the rendering, animation, and hardware logic from scratch. There are of course other engines, but they are either devoid of the features you need to compete with AAA titles, or have severe performance limitations.
jeremyjh 1467 days ago [-]
It doesn't provide everything that those engines have but Urho3d is a very solid engine that has been under development for many years.

https://urho3d.github.io/

Someone else already mentioned Godot, which I think is fantastic and a lot of fun for making 2d games. I have not tried to use it for 3d, I understand its new renderer is making it more competitive with more advanced engines.

edit: btw, I don't really agree with anything GP said. I just wanted to plug these two nice open source alternatives. I personally (as a casual hobbyist) did not really mesh with Unity or Unreal, for different reasons but I definitely understand what they offer to both beginners and serious businesses.

jokoon 1467 days ago [-]
The question is more about how to compete.

For example, there are few indie games that managed to get a lot of sales because they innovated in terms of technology or ideas, like factorio or minecraft. Yet those games are pretty ugly, they're not "HD", while there are too many indie games that are HD with high poly graphics.

Indie devs don't have money, so they obviously cannot compete on the art and content. They have to make games nobody is making, and not hesitate to make pixel art or low poly content to concentrate on the gameplay. They cannot compete with big studios on content. It's too time-expensive.

This is the most important thing that people forget about games: the content doesn't matter. 3D artists don't matter. Games are not CG movies. Unless you're Kojima or final fantasy, nobody cares about cinematics. Game developers must focus on the gameplay and stop advertising about fancy graphics. This is a never-ending problem of video games since the 3D era: too much focus on the graphics, and no efforts on the gameplay, game balance, game theory, mechanics, reward system, difficulty, game lifespan, etc.

> What engines are you suggesting

If you're not aiming for bleeding-edge graphics, you can use a graphics engine, or make your own. For physics and other stuff, there are plenty of libraries. I'm just saying Unreal and Unity are not engines, they're frameworks/platforms. They impose too many constraints, and you can't do everything you want with them. Not to mention the IP or business side.

Xeronate 1467 days ago [-]
Also interested in this question. I recently started on my first game project after 4 years of being a professional C# dev and chose unreal because it seemed to be the quickest path to success for a small multiplayer arena battle game. Seems like coding my own networking layer, renderer, animation system, etc. would take way longer.
philipov 1467 days ago [-]
Do you think Godot is either missing necessary features or has severe performance limitations?
jayd16 1467 days ago [-]
Currently it's missing necessary features. The roadmap looks good but I can't ship on a roadmap.
philipov 1467 days ago [-]
I've been looking at Godot for a hobby project. Could you please describe what features you need from it that it's missing?
jayd16 1467 days ago [-]
The biggest for me is the fact they're rewriting the graphics stack. The churn is enough but I also just don't like "fixed"/"simplified"/"helpful" tools that hide the underlying platform. Unity's shader language is extremely ugly but at least I can use raw GLSL if I have to. I've had to use custom pragmas to get certain acceleration features to work on Samsung hardware that doesn't seem possible in Godot. Hopefully the updates with Vulkan will have more flexibility.

That said, for a hobby project it seems fine.

mikst 1467 days ago [-]
Hi, I'm not very proficient in graphics programming, but the Godot docs say this:

> Godot uses a shading language similar to GLSL ES 3.0. Most datatypes and functions are supported, and the few remaining ones will likely be added over time. Unlike the shader language in Godot 2.x, this implementation is much closer to the original.

https://docs.godotengine.org/en/3.0/tutorials/shading/shadin...

Godot is relatively new and definitely "not there yet", but at least with its open nature you can do `git clone godot-doc.git` and no top manager can take it away from you.

crocodiletears 1467 days ago [-]
I think you're discounting the value of the tools Unity and UE offer to experienced developers/teams, and ignoring that outside a few exceptions, Indies haven't really been able to compete with AAA Devs since the PS2 era, when games more or less got programmed from scratch, or were adapted from the brittle code of previous titles.

Nearly every major studio or publisher has a similar toolset they've either built from scratch (RED Engine, Frostbite, Anvil, Decima, Id Tech, etc.) or license (like Unreal) and built on top of. Years of testing, R&D, and workflow refinement goes into making these toolchains extensible and useful for teams of all skill levels and functions, as well as to make them scale well to the needs of different titles.

The trade-off with these tools, is that their tremendous breadth can make working with them on complex projects their own knowledge domain for smaller teams, even as it abstracts away many of the complexities that come with developing your own engine.

If you're an engineering oriented developer who has the luxury of developing for a very restricted set of platforms and the time to debug their own tooling, with narrow, well-defined graphical requirements, a clear vision, and a technically inclined art team, then using a framework like Ogre makes perfect sense. Lightweight frameworks are a joy to work with, and you only have to add what you need on top of them to get the job done.

But iteration is slower, and you may spend months getting your tooling where it needs to be if you're going to work with a team.

Good luck onboarding new artists and game designers though. First you have to worry about training. After that, compatibility. Artists tend to have a workflow that works best for them, and even using open file formats, and established toolchains, they've got a gift for finding edge cases in your system. Your map editing toolchain also has to work for both the artists, and the designers.

Conversely, a mature engine like UE, or Unity has a wealth of crowdsourced documentation, and it's almost impossible to trip over an issue that someone else hasn't already triaged before you. New team members are almost guaranteed to know how to fulfill their responsibilities within the constraints of the engine's toolset, so they can get to iterating on prototypes much faster.

They're also typically extensible enough that the engineering guy(s) can put whatever efforts they would have contributed to designing a rendering engine, tools, and debugging platform issues into adding features unique to their title.

The featureset on these behemoths may be overwhelming, but it's more or less on par with what the 'pros' are using, so just by adopting one, you're virtually eliminating your technical capability gap with them. There is still a gap. With respect to tooling, Indies simply lack access and experience with parametric modelling tools like Houdini which greatly increase the efficiency of content-generation.

The rest of that gap can be broken down to experience, and manpower. Experience can be fixed, but few indies are able to throw the number of bodies at a project that someone like EA or UbiSoft can.

Engines allow anyone to make AAA level experiences with AAA levels of graphical fidelity now.

The output gap has become about art and content, something no indie can effectively compete with in terms of volume.

I agree that developing on large engines can cause you to hit a wall, and the engine essentially becomes the developer's world, but I think overall the proportion of people in the world who go further is the same, even if the proportion of people cluelessly noodling with the low-cost space shuttle they've been given and putting out garbage increases.

People incapable of competing have been allowed to join the market. But the democratization of engines has also given those with the potential to be great a much lower barrier to entry onto the development scene.

gentleman11 1467 days ago [-]
Excellent post, but in the context of this discussion, I disagree on one point:

> a mature engine like UE, or Unity has a wealth of crowdsourced documentation, and it's almost impossible to trip over an issue that someone else hasn't already triaged before you

C++ in UE4 is a nightmare to work with because you have to scour forums for a day and a half to find somebody who might have mentioned the name of the function you need. Great engine besides that, but that is a pretty big deal. Unity however - everything is the first page of search results. Incredibly valuable!

crocodiletears 1467 days ago [-]
Thanks, and quite right. An oversight on my part. Admittedly, programming for Unreal's well outside my realm of experience. I tried it some time ago, but mostly played with it as an art tool. Its coding conventions seemed more opinionated relative to Godot and Unity's, which turned me off, because I was only really looking for a fun toy in a domain outside my experience.
jokoon 1466 days ago [-]
> Engines allow anyone to make AAA level experiences with AAA levels of graphical fidelity now.

They allow it, but those indie devs don't seem to compete with AAA games. What's the catch then? I think that it's performance, game design, experience, etc. It's pointless to compete with big game studios on the same types of games.

> AAA level experiences

Sorry but what exactly is this? That's not what makes indie games interesting. And I don't think indie game devs will really achieve those "AAA" things.

Shorel 1467 days ago [-]
Maybe I am wrong, but I think Unreal engine will let you do anything and you can actually compete with the big studios, performance and feature wise.

The only reason someone can't actually compete, is man-hours available. Some things are just very time-consuming to implement.

But this last point applies to any engine, even the ones the big studios use.

About open source, just check Godot. And Godot is a competitor to Unity, in the 'learn this engine and you will have to keep using it' market. Which I think is fine for Indies.

ngold 1467 days ago [-]
As a noob, I've found learning C++ in Unity pretty comprehensive. I can also look at how real code works by looking through what others have done. And the YouTube tutorial section is huge. At the end of the day I can't wait to know enough to jump ship to Godot.
reilly3000 1467 days ago [-]
Unity is in a comfortable position, with newer lines of business from VR, architecture, and animation along with its strong position on the long tail of desktop, console, and mobile gaming. They have the marketplace to beat and a userbase beyond comparison. I believe this enables Google-esque behavior, and it's disconcerting at best.

My son was really into Unity development for a while, but he got discouraged when they deprecated their entire networking stack without providing a suitable replacement (since August 2018) and are even removing support from old LTS releases.

For a multi-billion dollar company to suddenly take down a wiki that hundreds of man-months went into creating, that is visited millions of times each year, with no warning or archive- that is open user hostility. They can certainly afford to keep it around in read-only mode as a static site. An intern could run wget and have a mirror up in a few days tops. If there is unmoderated content they are worried about, they can afford to clean it up. This is wrong.

golergka 1467 days ago [-]
> they deprecated their entire networking stack without providing a suitable replacement

It was their second networking stack already, and both had been riddled with problems. The last stack is still open-sourced on Bitbucket, and you can choose it as a starting point for your own networking stack.

Some promises, like what Unity networking was trying to achieve - hassle-free real-time game multiplayer without dedicated servers - are just not achievable, and your customers are better off if you admit it. It's much worse when you buy into marketing hype and start discovering structural problems that require a total rewrite close to release.

I worked with Unity since 2009, and in 2017/2018 implemented a custom multiplayer solution for an open-world RPG game without a dedicated server. Which was originally written on that exact stack. Never had a worse burnout in my life.

Damorian 1467 days ago [-]
Unity or Unreal?
PudgePacket 1467 days ago [-]
They're talking about both at different points..?
kick 1467 days ago [-]
I don't think so, hence the accusation of "Google-esque behavior" and such.

They seem to be a concerned parent who's mixed up.

reilly3000 1467 days ago [-]
I was mixed up about who owned the wiki, sorry folks. Unity networking is still broken. My son did try to move on to Unreal, but he never made anything of substance with it. He got interested in developing with 6502s and got out of 3D games for now.
dbuder 1467 days ago [-]
Oh god it'll be a z80 next
gsich 1467 days ago [-]
Have you read the link?
richardboegli 1467 days ago [-]
> So why can’t we put a read-only archive online currently?

> The Wiki, even in it’s read-only state, was presenting security risks, and it was deemed necessary to take it offline.

https://forums.unrealengine.com/unreal-engine/announcements-...

treve 1467 days ago [-]
Seems like a poor excuse. You can make a read-only version of a Wiki without running wiki software. Just mirror the HTML.

At the very least they could have made this open source

toomuchtodo 1467 days ago [-]
It would’ve been trivial to crawl the forum and dump the resulting WARC files into the Wayback machine to provide a permanent archive. This is just apathetic laziness on their part.

https://github.com/ArchiveTeam/grab-site
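
If I remember the tooling right, usage boils down to something like this (check the README for the current install steps; the target URL is just the obvious candidate):

    # grab-site crawls the given URL and writes out WARC files as it goes
    pip install grab-site
    grab-site https://wiki.unrealengine.com/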

jsjddbbwj 1467 days ago [-]
Pretty sure the wayback machine already has a full archive
toomuchtodo 1467 days ago [-]
I haven’t enumerated all of the Wayback captures for the forum yet, so I can’t speak to how recent and complete the archive of the forum is.
daenz 1467 days ago [-]
I used the wiki extensively in my last UE4 project. It had its warts, but it also had valuable information that did not exist anywhere else. Taking this down without a torrent mirror or a grace period is phenomenally harmful to the community. Bad move!
bane 1467 days ago [-]
Looks like the waybackmachine got some of it at least

https://web.archive.org/web/20191212230615/https://wiki.unre...

rs23296008n1 1467 days ago [-]
I never understand why companies do this. It's very developer-hostile.

Are they having financial problems? Surely Fortnite is keeping the lights on...

Could be a signal of underlying management confusion/instability. Might need to reassess.

axlee 1467 days ago [-]
You could host that wiki for ten bucks a month.
arvinsim 1467 days ago [-]
Pretty sure it's not a financial problem. As in their announcement, it is a "security issue"
rs23296008n1 1467 days ago [-]
Depends on traffic/content, but yeah text is cheap.

Might grab a copy of the archive for reference and then host it locally. We've got a ton of internal references that will be broken.

We haven't touched UE for about 10 months.

rs23296008n1 1466 days ago [-]
Not sure why someone disagreed without a supporting comment.

Every line is fact.

outworlder 1467 days ago [-]
Can't be due to financial problems with Tencent banking them, can it?
Zanderax 1467 days ago [-]
Tim Sweeney has a net worth of 7 billion. It's not about the money, it's bad management.
kevingadd 1467 days ago [-]
The lack of a static copy of the wiki really sucks but it's understandable that a mediawiki install would be pulled indefinitely. Mediawiki racks up a dozen CVEs in an average year and even a single one of those is an opportunity to perform watering hole attacks on every UE licensee. Getting RCE on a single UE customer's machine is an opportunity for million+-dollar industrial espionage - it's not uncommon for someone to get a copy of a game's source code and try to extort the developer for cash. We generally only know about the cases where the extortion fails...

It's possible that really aggressive security measures could mostly prevent that but even if you were to patch weekly that won't stop someone from pairing an undisclosed mediawiki attack with some other attack that isn't well-known. A game studio's machines are probably using LTS versions of Firefox or Chrome w/slower update cadence, which potentially means multiple days of vulnerability even after an exploit is patched.

Also, now that Epic processes credit card payments (Epic Store, etc) it's possible the mediawiki install would prevent them from passing PCI-DSS audits.

AA-BA-94-2A-56 1467 days ago [-]
Here is the Linking DLLs wiki page discussed in the forum thread:

https://web.archive.org/web/20181004001430/https://wiki.unre...

uk_programmer 1467 days ago [-]
Microsoft did a similar thing with the ASP.NET site. There were quite a lot of old articles for ASP.NET WebForms that were really good references, or if you were working with someone who was new to WebForms you could just point them to a particular article and say "read through this; it has almost everything covered on how to do this".

Very frustrating.

bashwizard 1467 days ago [-]
That's a great way to get people to use Unity instead. Which everyone already should.
stolen_biscuit 1467 days ago [-]
Bonehead move. Leave it up as read-only and mark when pages are out of date so users can look for up-to-date information elsewhere. Hope they come to their senses and re-upload a read-only archive of the documentation
IXxXI 1465 days ago [-]
Khan Academy's internet traffic increased 250% over normal during quarantine.

Unreal Wiki must be experiencing similar trends. The real reason it was shut down.

misotaur 1467 days ago [-]
Kinda silly. Good documentation is, partially, what will win the engine wars, if we can call it that, and Unreal is not exactly crystal clear.
mmm_grayons 1467 days ago [-]
It's a shame most people didn't hear about this; I don't suppose Archive Team got any of this?
efficax 1467 days ago [-]
One of my first paid programming jobs was writing extensions to the Wiki engine used by the Unreal Wiki (TWiki).

RIP

rurban 1467 days ago [-]
Not twiki, looks like MediaWiki to me. Wonder what attracts people to MediaWiki anyway. Horrible and insecure code all over, easy to break into. Only maintainable with massive manual administration costs. And hundreds of Wikipedia editors.

On the opposite end, I once maintained PhpWiki, which never had any security problems, and all my known instances still work fine after decades. Not much need for massive manual intervention; lots of admin plugins. XSS attacks impossible. I ran backup jobs for the DB (Berkeley DB was 30x faster than MySQL) and as an HTML archive. So even if you have to take it down for PHP maintenance, you can trivially replace it with a read-only dump without any action buttons and without any PHP.

efficax 1467 days ago [-]
It was definitely TWiki in 2001. They must have migrated
p2t2p 1467 days ago [-]
Using any proprietary/corporate system feels more and more like living on a volcano.
gregjw 1467 days ago [-]
April Fools?
ericzawo 1467 days ago [-]
There's no backup floating around online, cached somewhere? Like waybackmachine?
friendlybus 1467 days ago [-]
Regardless of the reasoning the message for indies is clear: Time is running out.
jhare 1467 days ago [-]
"security risk" - So they're just trying to hand-wave a bluff at a huge community of developers? No one believes this. edit: also I feel bad for this community manager having to lie and apologize
Elv13 1467 days ago [-]
I am the co-maintainer of an open-source project called AwesomeWM. I took down our wiki years ago due to:

* Constant vandalism

* Dubious user-created content rendering computers non-functional

* Trolling edits to cause breakages to people copy/pasting shell commands

* SPAM

* Maintaining the wiki

Before that we forced users to log in for edits, then forced moderator approval for everything, then forced moderator approval for new accounts. Then we gave up and retired the wiki.

So no, wikis are not free content. They are a pain, especially when your community tends to have many trolls/hostile individuals, like the gaming community. It's not "downright lies" all the time.

kroltan 1467 days ago [-]
Completely off-topic, but thank you for your work on AwesomeWM, together with all other contributors! It is a fantastic piece of software and I always use it on all my Linux installs.

About maintaining wikis, that is indeed a problem. In addition, most wiki software I used has extremely clunky administrative tools which make moderation way more challenging than needed.

I used to maintain a tiny private wiki for a previous job, and even in a very small operation (10s of users), it was a disproportionately large maintenance burden.

jhare 1467 days ago [-]
I agree with your points, and there are a lot of constraints for running a wiki. Especially on a volunteer basis.

Indeed it is not free content, and the volunteers that edited the UE4 wiki must be pretty disappointed. But Epic isn't broke, and the vague reasoning they offer is insufficient to me and many others.

p.s. Coincidentally I am a daily user of AwesomeWM. Thanks for your efforts!

phreack 1467 days ago [-]
They're of course allowed to refuse to maintain it, but why in the world wouldn't they keep a static read-only copy of it?!
Zanderax 1467 days ago [-]
Epic can't secure a wiki but they can secure the biggest video game in the world? Nah they just lazy.
rambojazz 1467 days ago [-]
What's the story here and why is this newsworthy?
terrycody 1467 days ago [-]
self-destruction measure completed
marta_morena_23 1467 days ago [-]
"We hear your concerns and frustrations. And we missed an opportunity to give you notice prior to the Wiki page coming down, and for that we apologize.

So why can’t we put a read-only archive online currently? The Wiki, even in it’s read-only state, was presenting security risks, and it was deemed necessary to take it offline.

We still have the data, and as mentioned above, we will work to migrate the top content into various official resources. In the meantime, we’re investigating how we can make this content available, even in a rudimentary way.

Some individuals from the community have already reached out to explore a community-hosted Wiki. We are following up with these folks and are evaluating next steps to see what may be possible here as well. "

Well, you always learn new ways to express incompetence. They do know that you can render wikis into static HTML pages, right?
