MediaWiki master
RefreshLinksJob Class Reference
Job to update link tables for pages.
Public Member Functions
__construct (Title $title, array $params)
getDeduplicationInfo ()
Subclasses may need to override this to make duplication detection work.
run ()
workItemCount ()
Public Member Functions inherited from Job
__construct ($command, $title, $params=false)
allowRetries ()
getDeduplicationInfo ()
Subclasses may need to override this to make duplication detection work.
getLastError ()
getParams ()
getQueuedTimestamp ()
getReadyTimestamp ()
getReleaseTimestamp ()
getRequestId ()
getRootJobParams ()
getTitle ()
getType ()
hasRootJobParams ()
ignoreDuplicates ()
Whether the queue should reject insertion of this job if a duplicate exists.
insert ()
Insert a single job into the queue.
isRootJob ()
run ()
Run the job.
teardown ()
Do any final cleanup after run(), deferred updates, and all DB commits happen.
toString ()
workItemCount ()
Static Public Member Functions
static newDynamic (Title $title, array $params)
static newPrioritized (Title $title, array $params)
Static Public Member Functions inherited from Job
static batchInsert ($jobs)
Batch-insert a group of jobs into the queue.
static factory ($command, Title $title, $params=[])
Create the appropriate object to handle a specific job.
static newRootJobParams ($key)
Get "root job" parameters for a task.
Protected Member Functions
runForTitle (Title $title)
Protected Member Functions inherited from Job
addTeardownCallback ($callback)
setLastError ($error)
Additional Inherited Members
Public Attributes inherited from Job
string $command
array $metadata = []
Additional queue metadata.
array $params
Array of job parameters.
Protected Attributes inherited from Job
string $error
Text for error that occurred last.
bool $removeDuplicates
Expensive jobs may set this to true.
callable[] $teardownCallbacks = []
Title $title
Job to update link tables for pages.
This job comes in a few variants: recursive jobs that update link tables for all pages that link to a given title, jobs that update link tables for an explicit set of pages, and jobs that update link tables for the single page named by the job title.
Definition at line 38 of file RefreshLinksJob.php.
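As a rough illustration of those variants (a sketch only: the 'recursive', 'table', and 'pages' parameter keys and the page-array shape are assumptions based on the description above, not verified against the file):

$title = Title::newFromText( 'Template:Infobox' );

// a) Recursive variant: refresh every page whose link tables depend on $title.
$recursive = new RefreshLinksJob( $title, [ 'recursive' => true, 'table' => 'templatelinks' ] );

// b) Page-set variant: only the listed pages (page ID => [ namespace, DB key ])
//    are refreshed; the job title mainly serves as a label.
$batch = new RefreshLinksJob( $title, [ 'pages' => [ 12345 => [ 0, 'Some_article' ] ] ] );

// c) Single-page variant: no extra parameters, the job's own title is refreshed.
$single = new RefreshLinksJob( $title, [] );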
RefreshLinksJob::__construct ( Title $title, array $params )
Definition at line 46 of file RefreshLinksJob.php.
RefreshLinksJob::getDeduplicationInfo ( )
Subclasses may need to override this to make duplication detection work.
The resulting map conveys everything that makes the job unique. This is only checked if ignoreDuplicates() returns true, meaning that duplicate jobs are supposed to be ignored.
Implements IJobSpecification.
Definition at line 270 of file RefreshLinksJob.php.
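As an illustration of the contract (not RefreshLinksJob's actual override), a hypothetical subclass might drop a volatile parameter from the map so that otherwise identical jobs are detected as duplicates:

// Hypothetical subclass, for illustration only.
class ExampleRefreshJob extends Job {
	public function getDeduplicationInfo() {
		$info = parent::getDeduplicationInfo();
		if ( is_array( $info['params'] ) ) {
			// Ignore a timestamp-like parameter so two jobs queued at different
			// times for the same page still count as duplicates.
			unset( $info['params']['requestTimestamp'] );
		}
		return $info;
	}
}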
static RefreshLinksJob::newDynamic ( Title $title, array $params )
Parameters:
Title $title
array $params
Definition at line 74 of file RefreshLinksJob.php.
References $job, Job\$params, and Job\$title.
Referenced by WikiPage\triggerOpportunisticLinksUpdate().
static RefreshLinksJob::newPrioritized ( Title $title, array $params )
Parameters:
Title $title
array $params
Definition at line 62 of file RefreshLinksJob.php.
References $job, Job\$params, and Job\$title.
Referenced by LinksUpdate\queueRecursiveJobs(), and WikiPage\triggerOpportunisticLinksUpdate().
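A brief usage sketch for the two factories above; the assumption (suggested by their names and the callers listed) is that both return a RefreshLinksJob routed to a prioritized or dynamically rated variant of the queue. lazyPush() defers the enqueue until the end of the request.

$title = Title::newFromText( 'Main_Page' );

// Higher-urgency refresh, e.g. queued right after a change that affects links.
JobQueueGroup::singleton()->lazyPush( RefreshLinksJob::newPrioritized( $title, [] ) );

// Lower-urgency, opportunistic refresh, e.g. when a page view notices stale link tables.
JobQueueGroup::singleton()->lazyPush( RefreshLinksJob::newDynamic( $title, [] ) );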
RefreshLinksJob::run ( )
Definition at line 81 of file RefreshLinksJob.php.
References $e, Job\$params, $wgUpdateRowsPerJob, Job\getRootJobParams(), Title\makeTitleSafe(), BacklinkJobUtils\partitionBacklinkJob(), runForTitle(), JobQueueGroup\singleton(), wfGetLBFactory(), and wfWikiID().
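Judging only from the references above, the control flow is roughly the following paraphrase (a sketch, not the method body): recursive jobs are fanned out into smaller per-batch jobs via BacklinkJobUtils::partitionBacklinkJob(), sized by $wgUpdateRowsPerJob, while jobs with a 'pages' list or a bare title fall through to runForTitle().

// Paraphrased control flow only; see line 81 of RefreshLinksJob.php for the real code.
function sketchRun( RefreshLinksJob $job, array $params, Title $title ) {
	global $wgUpdateRowsPerJob;

	if ( !empty( $params['recursive'] ) ) {
		// Fan out: replace this job with a batch of smaller jobs covering the backlinks.
		$jobs = BacklinkJobUtils::partitionBacklinkJob(
			$job, $wgUpdateRowsPerJob, 1, [ 'params' => $job->getRootJobParams() ]
		);
		JobQueueGroup::singleton()->push( $jobs );
	} elseif ( isset( $params['pages'] ) ) {
		foreach ( $params['pages'] as list( $ns, $dbKey ) ) {
			// Refresh each listed page (runForTitle() is protected, so the call
			// below only works from inside the class; shown for shape only).
			// $job->runForTitle( Title::makeTitleSafe( $ns, $dbKey ) );
		}
	} else {
		// Single page: refresh the job's own title.
		// $job->runForTitle( $title );
	}
}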
RefreshLinksJob::runForTitle ( Title $title ) [protected]
Parameters:
Title $title
Definition at line 130 of file RefreshLinksJob.php.
References $content, $page, $parserOutput, $user, LinksUpdate\acquirePageLock(), DB_MASTER, WikiPage\factory(), Title\GAID_FOR_UPDATE, Title\getLatestRevID(), InfoAction\invalidateCache(), Revision\newFromId(), User\newFromId(), User\newFromName(), Revision\newFromTitle(), Revision\RAW, IDBAccessObject\READ_LATEST, DataUpdate\runUpdates(), Job\setLastError(), ParserCache\singleton(), TS_MW, TS_UNIX, wfGetDB(), and wfTimestamp().
Referenced by run().
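Judging from the references above, the work amounts to: load the page and its latest revision, obtain a ParserOutput, and run the content's secondary data updates, which include the LinksUpdate that rewrites the link tables. The helper below is an illustrative reconstruction under those assumptions; refreshLinksForTitle is not a real MediaWiki function.

// Illustrative reconstruction, not the actual protected method.
function refreshLinksForTitle( Title $title ) {
	$page = WikiPage::factory( $title );
	$revision = Revision::newFromTitle( $title, 0, Revision::READ_LATEST );
	if ( !$revision ) {
		return false; // Page no longer exists; nothing to refresh.
	}

	$content = $revision->getContent( Revision::RAW );
	$parserOptions = $page->makeParserOptions( 'canonical' );
	$parserOutput = $content->getParserOutput( $title, $revision->getId(), $parserOptions );

	// Secondary data updates include LinksUpdate, which rewrites the link tables.
	$updates = $content->getSecondaryDataUpdates( $title, null, false, $parserOutput );
	DataUpdate::runUpdates( $updates );

	return true;
}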
RefreshLinksJob::workItemCount ( )
Definition at line 284 of file RefreshLinksJob.php.